* Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
@ 2018-08-09 6:20 Wodkowski, PawelX
0 siblings, 0 replies; 9+ messages in thread
From: Wodkowski, PawelX @ 2018-08-09 6:20 UTC (permalink / raw)
To: spdk
I think you need to add
-numa node,memdev=mem0
to the QEMU command line.
Also consider adding 'prealloc=yes,host-nodes=0,policy=bind' to the '-object' option.
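Combined with the existing flags, the memory backend section would look something like this (a sketch only — the size and hugepage path must match your setup, and share=on remains required for vhost-user):

```shell
# Sketch of the relevant QEMU options: a hugepage-backed, shared,
# preallocated memory object bound to host NUMA node 0, with the guest
# NUMA node attached to it via memdev.
qemu-system-x86_64 -enable-kvm -m 1G \
  -object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind \
  -numa node,memdev=mem0 \
  ...
```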
Thanks.
From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
Sent: Thursday, August 9, 2018 4:05 AM
To: spdk(a)lists.01.org
Subject: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
Hi all:
I just created an NVMe bdev and a vhost-scsi controller that can be accessed by QEMU, but errors occur when IO is issued from the VM.
Here are my steps for the SPDK configuration:
Host OS: Ubuntu 18.04, Kernel 4.15.0-30
Guest OS: Ubuntu 18.04
QEMU: 2.12.0
SPDK: v18.07
1) sudo HUGEMEM=4096 scripts/setup.sh
0000:05:00.0 (8086 2522): nvme -> vfio-pci
Current user memlock limit: 4116 MB
This is the maximum amount of memory you will be
able to use with DPDK and VFIO if run as current user.
To change this, please adjust limits.conf memlock limit for current user.
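(Side note: the hugepage reservation from setup.sh can be double-checked on the host with a plain /proc/meminfo read — nothing SPDK-specific:)

```shell
# HUGEMEM=4096 should show up here as HugePages_Total * Hugepagesize
# (e.g. 2048 pages of 2048 kB); HugePages_Free shrinks once vhost starts.
grep -i '^Huge' /proc/meminfo
```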
2) sudo ./app/vhost/vhost -S /var/tmp -m 0x3 &
[ DPDK EAL parameters: vhost -c 0x3 -m 1024 --legacy-mem --file-prefix=spdk_pid1921 ]
EAL: Detected 12 lcore(s)
EAL: Detected 1 NUMA nodes
EAL: Multi-process socket /var/run/dpdk/spdk_pid1921/mp_socket
EAL: No free hugepages reported in hugepages-1048576kB
EAL: Probing VFIO support...
EAL: VFIO support initialized
app.c: 530:spdk_app_start: *NOTICE*: Total cores available: 2
reactor.c: 718:spdk_reactors_init: *NOTICE*: Occupied cpu socket mask is 0x1
reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 1 on socket 0
reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 0 on socket 0
3) sudo ./scripts/rpc.py construct_vhost_scsi_controller --cpumask 0x1 vhost.0
EAL: PCI device 0000:05:00.0 on NUMA socket 0
EAL: probe driver: 8086:2522 spdk_nvme
EAL: using IOMMU type 1 (Type 1)
Nvme0n1
4) sudo ./scripts/rpc.py add_vhost_scsi_lun vhost.0 0 Nvme0n1
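As a quick sanity check at this point, the resulting configuration can be dumped (assuming the get_vhost_controllers RPC exists in your SPDK version — the name is from memory, so verify it first):

```shell
# Dump all vhost controllers and their attached LUNs as JSON
# (RPC name assumed; check ./scripts/rpc.py --help for your version).
sudo ./scripts/rpc.py get_vhost_controllers
```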
5) start qemu:
taskset qemu-system-x86_64 -enable-kvm -m 1G \
-name bread,debug-threads=on \
-daemonize \
-pidfile /var/log/bread.pid \
-cpu host \
-smp 4,sockets=1,cores=4,threads=1 \
-object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on -numa node,memdev=mem0 \
-drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
-chardev socket,id=char0,path=/var/tmp/vhost.0 \
-device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
-machine usb=on \
-device usb-tablet \
-device usb-mouse \
-device usb-kbd \
-vnc :2 \
-net nic,model=virtio \
-net user,hostfwd=tcp::2222-:22
Then, when I use fio to test the vhost NVMe disk in the guest VM, I get the following error messages on the host console.
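For reference, the guest-side fio run was along these lines (a sketch: the actual job parameters were not included in the mail, and only /dev/sdb comes from the lsblk output shown later):

```shell
# Hypothetical guest-side fio command; block size, queue depth and
# runtime are assumptions, only the /dev/sdb device name is from lsblk.
sudo fio --name=randread --filename=/dev/sdb --rw=randread --bs=4k \
    --iodepth=32 --ioengine=libaio --direct=1 --runtime=30 --time_based
```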
===========================================================================
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fed64d000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:32
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
[the four lines above repeat several more times]
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fed64d000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
[the four lines above repeat several more times]
===========================================================================
I used lsblk to check the block device information in the guest and could see the NVMe disk as sdb.
>lsblk --output "NAME,KNAME,MODEL,HCTL,SIZE,VENDOR,SUBSYSTEMS"
===========================================================================
NAME KNAME MODEL HCTL SIZE VENDOR SUBSYSTEMS
fd0 fd0 4K block:platform
loop0 loop0 12.2M block
loop1 loop1 86.6M block
loop2 loop2 1.6M block
loop3 loop3 3.3M block
loop4 loop4 21M block
loop5 loop5 2.3M block
loop6 loop6 13M block
loop7 loop7 3.7M block
loop8 loop8 2.3M block
loop9 loop9 86.9M block
loop10 loop10 34.7M block
loop11 loop11 87M block
loop12 loop12 140.9M block
loop13 loop13 13M block
loop14 loop14 140M block
loop15 loop15 139.5M block
loop16 loop16 3.7M block
loop17 loop17 14.5M block
sda sda QEMU HARDDISK 0:0:0:0 32G ATA block:scsi:pci
sda1 sda1 32G block:scsi:pci
sdb sdb NVMe disk 2:0:0:0 27.3G INTEL block:scsi:virtio:pci
sr0 sr0 QEMU DVD-ROM 1:0:0:0 1024M QEMU block:scsi:pci
===========================================================================
Can anyone help me solve this problem?
Thanks.
Adam Chang
* Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
@ 2018-08-10 8:54 Adam Chang
From: Adam Chang @ 2018-08-10 8:54 UTC (permalink / raw)
To: spdk
Hi all:
After applying the patch, IO from the VM can now be issued to the vhost-scsi NVMe device.
Thank you for your help!
Adam Chang.
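For anyone hitting the same issue before the fix lands in a release, the change can presumably be applied on top of a v18.07 checkout like this (the "/1" patchset suffix is a guess — use the download command shown on the review page for the exact ref):

```shell
# Apply the fix from GerritHub change 421697 on top of a v18.07 checkout.
# NOTE: the refs/changes/.../1 patchset suffix is an assumption; the
# review page lists the exact ref in its download menu.
git fetch https://review.gerrithub.io/spdk/spdk refs/changes/97/421697/1
git cherry-pick FETCH_HEAD
```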
On Fri, Aug 10, 2018 at 1:14 PM Stojaczyk, DariuszX <dariuszx.stojaczyk(a)intel.com> wrote:
> > [ 4640.033876] DMAR: intel_iommu_map: iommu width (39) is not sufficient for
> > the mapped address (7fdce8200000)
>
> Thanks. It is SPDK's fault. I already pushed a patch to fix it:
> https://review.gerrithub.io/c/spdk/spdk/+/421697
> D.
>
> > -----Original Message-----
> > From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> > Sent: Friday, August 10, 2018 7:02 AM
> > To: Storage Performance Development Kit <spdk(a)lists.01.org>
> > Subject: Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> >
> > Hi:
> > Here is my host environment:
> > ==================================================================
> > Host OS: Ubuntu 18.04 x86_64
> > Linux Kernel: 4.15.0-30
> > CPU: Intel i7 8700K
> > Memory: 32GB
> > NVME SSD: Intel Optane Memory 32GB
> > ==================================================================
> > configuration for building QEMU:
> > ==================================================================
> > ./configure --prefix=/usr --target-list=x86_64-softmmu --enable-kvm \
> >   --enable-debug --enable-debug-info --enable-modules --enable-linux-aio \
> >   --enable-vnc --enable-trace-backends=log --enable-numa --disable-werror \
> >   --disable-strip --with-sdlabi=2.0
> > ==================================================================
> >
> > configuration for building SPDK:
> > ==================================================================
> > ./configure --enable-debug
> >
> > ==================================================================
> >
> > I checked dmesg; it showed the following error:
> >
> > ==================================================================
> >
> > [ 4640.033876] DMAR: intel_iommu_map: iommu width (39) is not sufficient for
> > the mapped address (7fdce8200000)
> >
> > ==================================================================
> >
> >
> > I added log options when starting the vhost target, as follows:
> > ==================================================================
> > ./app/vhost/vhost -S /var/tmp -m 0x3 -L vhost vhost_scsi &
> > ==================================================================
> >
> >
> > And here are my vhost log
> > ==================================================================
> > VHOST_CONFIG: new vhost user connection is 18
> > VHOST_CONFIG: new device, handle is 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_PROTOCOL_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_PROTOCOL_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_QUEUE_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_OWNER
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:0 file:25
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:1 file:26
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:2 file:27
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_MEM_TABLE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> > guest physical addr: 0x0
> > guest virtual addr: 0x7f4754600000
> > host virtual addr: 0x7fdce8000000
> > mmap addr : 0x7fdce8000000
> > mmap size : 0x40000000
> > mmap align: 0x200000
> > mmap off : 0x0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:2 file:29
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > Cannot set up DMA mapping, error 14
> > vhost.c: 541:spdk_vhost_dev_mem_register: *WARNING*: Failed to register
> > memory region 0. Future vtophys translation might fail.
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: vring call idx:0 file:30
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: vring call idx:1 file:25
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: vring call idx:2 file:26
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_MEM_TABLE
> > [the VHOST_USER_SET_MEM_TABLE message above repeats many more times]
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_VRING_BASE
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: vring base idx:2 file:259
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_MEM_TABLE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> > guest physical addr: 0x0
> > guest virtual addr: 0x7f4754600000
> > host virtual addr: 0x7fdce8000000
> > mmap addr : 0x7fdce8000000
> > mmap size : 0x40000000
> > mmap align: 0x200000
> > mmap off : 0x0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:0 file:27
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: vring kick idx:1 file:28
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: vring kick idx:2 file:29
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: vring call idx:0 file:31
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: vring call idx:1 file:30
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> > controller vhost.0
> > VHOST_CONFIG: vring call idx:2 file:25
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> > vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> > controller vhost.0 on lcore 0
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7fdd1c467000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1511:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7fdd1c467000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1511:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > ==================================================================
> >
> > Thanks,
> > Adam Chang.
> >
> > On Thu, Aug 9, 2018 at 9:55 PM Stojaczyk, DariuszX <dariuszx.stojaczyk(a)intel.com> wrote:
> >
> >
> > Thanks,
> > The address that vtophys fails on should be mapped. Something went
> > wrong, but that vhost log is not particularly helpful because it comes
> > from a non-debug app.
> > I could ask you to enable debug (./configure --enable-debug), but could
> > you first provide the dmesg? Are there any errors?
> > D.
> >
> > > -----Original Message-----
> > > From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> > > Sent: Thursday, August 9, 2018 12:56 PM
> > > To: Storage Performance Development Kit <spdk(a)lists.01.org>
> > > Subject: Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> > >
> > > Hi:
> > > I have added "-numa node,memdev=mem0" to the QEMU command line,
> > > but still have the same error message.
> > > Here are my modified QEMU command arguments:
> > >
> > >
> > >
> > ==================================================================
> > > taskset -c 2,3,4,5 qemu-system-x86_64 -enable-kvm -m 1G \
> > > -name bread,debug-threads=on \
> > > -daemonize \
> > > -pidfile /var/log/bread.pid \
> > > -cpu host \
> > > -smp 4,sockets=1,cores=4,threads=1 \
> > > -object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind \
> > > -numa node,memdev=mem0 \
> > > -drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
> > > -chardev socket,id=char0,path=/var/tmp/vhost.0 \
> > > -device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
> > > -machine usb=on \
> > > -device usb-tablet \
> > > -device usb-mouse \
> > > -device usb-kbd \
> > > -vnc :2 \
> > > -net nic,model=virtio \
> > > -net user,hostfwd=tcp::2222-:22
> > >
> > ==================================================================
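[Editor's note] The command above backs the guest's 1 GiB of RAM (`-m 1G`) with a hugepage-backed `memory-backend-file` on `/dev/hugepages`, so the host must have enough free 2 MiB hugepages before QEMU starts. A minimal sizing sketch (a standalone illustrative helper, not an SPDK or QEMU API):

```python
def hugepages_needed(guest_mem_bytes: int, hugepage_bytes: int = 2 * 1024 * 1024) -> int:
    """Number of hugepages required to back a guest of the given size."""
    # Ceiling division: a partially used page still consumes a whole page.
    return -(-guest_mem_bytes // hugepage_bytes)

print(hugepages_needed(1 << 30))  # 512 pages of 2 MiB for a 1 GiB guest
```

With `prealloc=yes`, QEMU faults in all of those pages up front, so a shortfall fails fast at startup instead of at I/O time.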
> > >
> > > And the following is the vhost log from QEMU starting:
> > >
> > ==================================================================
> > > VHOST_CONFIG: new vhost user connection is 18
> > > VHOST_CONFIG: new device, handle is 0
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_FEATURES
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_GET_PROTOCOL_FEATURES
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_PROTOCOL_FEATURES
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_GET_QUEUE_NUM
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_OWNER
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_FEATURES
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_CALL
> > > VHOST_CONFIG: vring call idx:0 file:25
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_CALL
> > > VHOST_CONFIG: vring call idx:1 file:26
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_CALL
> > > VHOST_CONFIG: vring call idx:2 file:27
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_FEATURES
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_MEM_TABLE
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_NUM
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_BASE
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_ADDR
> > > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> > > guest physical addr: 0x0
> > > guest virtual addr: 0x7fa1a4a00000
> > > host virtual addr: 0x7f8fb4000000
> > > mmap addr : 0x7f8fb4000000
> > > mmap size : 0x40000000
> > > mmap align: 0x200000
> > > mmap off : 0x0
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_KICK
> > > VHOST_CONFIG: vring kick idx:2 file:29
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_CALL
> > > VHOST_CONFIG: vring call idx:0 file:30
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_CALL
> > > VHOST_CONFIG: vring call idx:1 file:25
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_CALL
> > > VHOST_CONFIG: vring call idx:2 file:26
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_MEM_TABLE
> > > [the VHOST_USER_SET_MEM_TABLE message pair above repeats many more times]
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_GET_VRING_BASE
> > > VHOST_CONFIG: vring base idx:2 file:259
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_FEATURES
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_MEM_TABLE
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_NUM
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_BASE
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_ADDR
> > > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> > > guest physical addr: 0x0
> > > guest virtual addr: 0x7fa1a4a00000
> > > host virtual addr: 0x7f8fb4000000
> > > mmap addr : 0x7f8fb4000000
> > > mmap size : 0x40000000
> > > mmap align: 0x200000
> > > mmap off : 0x0
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_KICK
> > > VHOST_CONFIG: vring kick idx:0 file:27
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_NUM
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_BASE
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_ADDR
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_KICK
> > > VHOST_CONFIG: vring kick idx:1 file:28
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_NUM
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_BASE
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_ADDR
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_KICK
> > > VHOST_CONFIG: vring kick idx:2 file:29
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_CALL
> > > VHOST_CONFIG: vring call idx:0 file:31
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_CALL
> > > VHOST_CONFIG: vring call idx:1 file:30
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > > VHOST_USER_SET_VRING_CALL
> > > VHOST_CONFIG: vring call idx:2 file:25
> > > VHOST_CONFIG: virtio is now ready for processing.
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > > [the vtophys(0x7f8fe7fc9000) / INVALID FIELD / readv failed error block above repeats continuously]
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc8000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:24 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc8000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:24 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc8000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:24 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc8000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:24 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc8000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:24 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc8000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:24 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fe7fc9000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > s
* Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
@ 2018-08-10 5:14 Stojaczyk, DariuszX
0 siblings, 0 replies; 9+ messages in thread
From: Stojaczyk, DariuszX @ 2018-08-10 5:14 UTC (permalink / raw)
To: spdk
> [ 4640.033876] DMAR: intel_iommu_map: iommu width (39) is not sufficient for
> the mapped address (7fdce8200000)
Thanks. It is SPDK's fault. I already pushed a patch to fix it: https://review.gerrithub.io/c/spdk/spdk/+/421697
D.
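For context on the "iommu width (39) is not sufficient" message quoted above: 39 bits is the address width reported by the VT-d unit in the MGAW field (bits 21:16) of the DMAR capability register, which on most hosts is readable from /sys/class/iommu/dmar0/intel-iommu/cap. A minimal sketch of the decode, using a fabricated cap value rather than one from a real host:

```shell
# Decode the supported IOMMU address width from a DMAR capability
# register value. Per the VT-d spec, MGAW is bits 21:16 of the cap
# register and the usable address width is MGAW + 1.
decode_mgaw() {
    cap=$1
    mgaw=$(( (cap >> 16) & 0x3f ))
    echo $(( mgaw + 1 ))
}

# On a real host the raw value would come from sysfs, e.g.:
#   cap=0x$(cat /sys/class/iommu/dmar0/intel-iommu/cap)
# The value below is a fabricated example whose MGAW field is 38,
# i.e. a 39-bit IOMMU as in the dmesg line quoted above.
decode_mgaw 0x00260206    # prints 39
```

A 39-bit unit rejects any I/O virtual address at or above 2^39 (512 GiB), which matches the failing 0x7fdce8200000 mapping in the quoted dmesg line.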
> -----Original Message-----
> From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> Sent: Friday, August 10, 2018 7:02 AM
> To: Storage Performance Development Kit <spdk(a)lists.01.org>
> Subject: Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
>
> Hi:
> Here is my host environment:
> ==================================================================
> Host OS: Ubuntu 18.04 x86_64
> Linux Kernel: 4.15.0-30
> CPU: Intel i7 8700K
> Memory: 32GB
> NVME SSD: Intel Optane Memory 32GB
> ==================================================================
> configuration for building QEMU:
> ==================================================================
> ./configure --prefix=/usr --target-list=x86_64-softmmu --enable-kvm \
>   --enable-debug --enable-debug-info --enable-modules --enable-linux-aio \
>   --enable-vnc --enable-trace-backends=log --enable-numa --disable-werror \
>   --disable-strip --with-sdlabi=2.0
> ==================================================================
>
> configuration for building SPDK:
> ==================================================================
> ./configure --enable-debug
>
> ==================================================================
>
> I checked dmesg; it showed the following error:
>
> ==================================================================
>
> [ 4640.033876] DMAR: intel_iommu_map: iommu width (39) is not sufficient for
> the mapped address (7fdce8200000)
>
> ==================================================================
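When checking dmesg for this class of failure, filtering for the IOMMU-related tags is usually enough to surface the relevant line. A small sketch; the sample text below reuses the DMAR line from this thread plus one fabricated unrelated line, in place of piping in the real `dmesg` output:

```shell
# Filter kernel log output for IOMMU/DMAR-related messages.
# In practice you would pipe the live log in:  dmesg | grep -E 'DMAR|IOMMU|AMD-Vi'
sample_log='[ 4640.033876] DMAR: intel_iommu_map: iommu width (39) is not sufficient for the mapped address (7fdce8200000)
[ 4641.000000] usb 1-2: new high-speed USB device number 4'

printf '%s\n' "$sample_log" | grep -E 'DMAR|IOMMU|AMD-Vi'
```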
>
>
> I added the log options when starting the vhost target as follows:
> ==================================================================
> ./app/vhost/vhost -S /var/tmp -m 0x3 -L vhost vhost_scsi &
> ==================================================================
>
>
> And here is my vhost log:
> ==================================================================
> VHOST_CONFIG: new vhost user connection is 18
> VHOST_CONFIG: new device, handle is 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_GET_PROTOCOL_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_PROTOCOL_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_GET_QUEUE_NUM
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_OWNER
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:0 file:25
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:1 file:26
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:2 file:27
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_MEM_TABLE
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_NUM
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_BASE
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_ADDR
> VHOST_CONFIG: guest memory region 0, size: 0x40000000
> guest physical addr: 0x0
> guest virtual addr: 0x7f4754600000
> host virtual addr: 0x7fdce8000000
> mmap addr : 0x7fdce8000000
> mmap size : 0x40000000
> mmap align: 0x200000
> mmap off : 0x0
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> VHOST_CONFIG: vring kick idx:2 file:29
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> Cannot set up DMA mapping, error 14
> vhost.c: 541:spdk_vhost_dev_mem_register: *WARNING*: Failed to register
> memory region 0. Future vtophys translation might fail.
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: vring call idx:0 file:30
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: vring call idx:1 file:25
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: vring call idx:2 file:26
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_MEM_TABLE
> [the VHOST_USER_SET_MEM_TABLE message above repeats roughly a hundred more times]
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_GET_VRING_BASE
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: vring base idx:2 file:259
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_MEM_TABLE
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_NUM
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_BASE
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_ADDR
> VHOST_CONFIG: guest memory region 0, size: 0x40000000
> guest physical addr: 0x0
> guest virtual addr: 0x7f4754600000
> host virtual addr: 0x7fdce8000000
> mmap addr : 0x7fdce8000000
> mmap size : 0x40000000
> mmap align: 0x200000
> mmap off : 0x0
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> VHOST_CONFIG: vring kick idx:0 file:27
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_NUM
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_BASE
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_ADDR
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: vring kick idx:1 file:28
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_NUM
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_BASE
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_ADDR
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: vring kick idx:2 file:29
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: vring call idx:0 file:31
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: vring call idx:1 file:30
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost
> controller vhost.0
> VHOST_CONFIG: vring call idx:2 file:25
> VHOST_CONFIG: virtio is now ready for processing.
> vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
> vtophys translation - 0x7fdce8000000 len:0x40000000
> vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
> controller vhost.0 on lcore 0
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7fdd1c467000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1511:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7fdd1c467000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1511:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> ==================================================================
>
> Thanks,
> Adam Chang.
>
> On Thu, Aug 9, 2018 at 9:55 PM Stojaczyk, DariuszX
> <dariuszx.stojaczyk(a)intel.com> wrote:
>
>
> Thanks,
> The address that vtophys fails on should be mapped. Something went
> wrong, but that vhost log is not particularly helpful because it comes
> from a non-debug app.
> I could ask you to enable debug (./configure --enable-debug), but could
> you first provide the dmesg? Are there any errors?
> D.
>
> > -----Original Message-----
> > From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> > Sent: Thursday, August 9, 2018 12:56 PM
> > To: Storage Performance Development Kit <spdk(a)lists.01.org>
> > Subject: Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> >
> > Hi:
> > I have added "-numa node,memdev=mem0" to the QEMU command line, but I
> > still get the same error message.
> > Here are my modified QEMU command arguments:
> >
> >
> >
> ==================================================================
> > taskset -c 2,3,4,5 qemu-system-x86_64 -enable-kvm -m 1G \
> >   -name bread,debug-threads=on \
> >   -daemonize \
> >   -pidfile /var/log/bread.pid \
> >   -cpu host \
> >   -smp 4,sockets=1,cores=4,threads=1 \
> >   -object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind \
> >   -numa node,memdev=mem0 \
> >   -drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
> >   -chardev socket,id=char0,path=/var/tmp/vhost.0 \
> >   -device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
> >   -machine usb=on \
> >   -device usb-tablet \
> >   -device usb-mouse \
> >   -device usb-kbd \
> >   -vnc :2 \
> >   -net nic,model=virtio \
> >   -net user,hostfwd=tcp::2222-:22
> >
> ==================================================================
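[Editor's note: before launching QEMU with a hugepage-backed memory object like the one above, it can help to confirm the host actually has hugepages allocated and a hugetlbfs mount at the chosen mem-path. SPDK vtophys() failures commonly mean the shared guest memory is not hugepage-backed. This is a generic Linux sanity check, not part of the original thread; only standard `/proc` interfaces are used.]

```shell
# Check that hugepages are allocated on the host (Total must be non-zero)
grep -E 'HugePages_(Total|Free)' /proc/meminfo
# Check that a hugetlbfs filesystem is mounted for QEMU's mem-path
grep -i huge /proc/mounts || true
```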
> >
> > And the following is the vhost log from QEMU startup:
> >
> ==================================================================
> > VHOST_CONFIG: new vhost user connection is 18
> > VHOST_CONFIG: new device, handle is 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_GET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_PROTOCOL_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_PROTOCOL_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_QUEUE_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_OWNER
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_GET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:0 file:25
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:1 file:26
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:2 file:27
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_MEM_TABLE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> > guest physical addr: 0x0
> > guest virtual addr: 0x7fa1a4a00000
> > host virtual addr: 0x7f8fb4000000
> > mmap addr : 0x7f8fb4000000
> > mmap size : 0x40000000
> > mmap align: 0x200000
> > mmap off : 0x0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:2 file:29
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:0 file:30
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:1 file:25
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:2 file:26
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_MEM_TABLE
> > [previous VHOST_USER_SET_MEM_TABLE message repeated ~100 more times; elided]
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_VRING_BASE
> > VHOST_CONFIG: vring base idx:2 file:259
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_MEM_TABLE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> > guest physical addr: 0x0
> > guest virtual addr: 0x7fa1a4a00000
> > host virtual addr: 0x7f8fb4000000
> > mmap addr : 0x7f8fb4000000
> > mmap size : 0x40000000
> > mmap align: 0x200000
> > mmap off : 0x0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:0 file:27
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:1 file:28
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:2 file:29
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:0 file:31
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:1 file:30
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:2 file:25
> > VHOST_CONFIG: virtio is now ready for processing.
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > [the same four-line vtophys/readv error block repeats for each retried IO; elided]
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc8000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:24 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc8000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:24 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc8000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:24 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc8000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:24 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc8000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:24 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc8000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:24 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> = -22
> >
> >
> ==================================================================
> >
> > Thanks,
> > Adam Chang.
> >
> > On Thu, Aug 9, 2018 at 4:07 PM Stojaczyk, DariuszX
> > <dariuszx.stojaczyk(a)intel.com> wrote:
> >
> >
> > Can you provide a full vhost log?
> > D.
> >
> > > -----Original Message-----
> > > From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> > > Sent: Thursday, August 9, 2018 4:05 AM
> > > To: spdk(a)lists.01.org
> > > Subject: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> > >
> > > Hi all:
> > > I just created an NVMe bdev and a vhost-scsi controller that can be
> > > accessed by QEMU, but errors occur when I/O is issued from the VM.
> > > Here are my steps for the SPDK configuration:
> > >
> > > Host OS:Ubuntu 18.04, Kernel 4.15.0-30
> > > Guest OS: Ubuntu 18.04
> > > QEMU: 2.12.0
> > > SPDK: v18.07
> > >
> > > 1) sudo HUGEMEM=4096 scripts/setup.sh
> > >
> > > 0000:05:00.0 (8086 2522): nvme -> vfio-pci
> > >
> > > Current user memlock limit: 4116 MB
> > >
> > > This is the maximum amount of memory you will be
> > > able to use with DPDK and VFIO if run as current user.
> > > To change this, please adjust limits.conf memlock limit for current user.
> > >
> > > 2) sudo ./app/vhost/vhost -S /var/tmp -m 0x3 &
> > >
> > > [ DPDK EAL parameters: vhost -c 0x3 -m 1024 --legacy-mem --file-prefix=spdk_pid1921 ]
> > > EAL: Detected 12 lcore(s)
> > > EAL: Detected 1 NUMA nodes
> > > EAL: Multi-process socket /var/run/dpdk/spdk_pid1921/mp_socket
> > > EAL: No free hugepages reported in hugepages-1048576kB
> > > EAL: Probing VFIO support...
> > > EAL: VFIO support initialized
> > > app.c: 530:spdk_app_start: *NOTICE*: Total cores available: 2
> > > reactor.c: 718:spdk_reactors_init: *NOTICE*: Occupied cpu socket mask is 0x1
> > > reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 1 on socket 0
> > > reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 0 on socket 0
> > >
> > > 3) sudo ./scripts/rpc.py construct_vhost_scsi_controller --cpumask 0x1 vhost.0
> > > EAL: PCI device 0000:05:00.0 on NUMA socket 0
> > > EAL: probe driver: 8086:2522 spdk_nvme
> > > EAL: using IOMMU type 1 (Type 1)
> > > Nvme0n1
> > >
> > > 4) sudo ./scripts/rpc.py add_vhost_scsi_lun vhost.0 0 Nvme0n1
> > > 5) start qemu:
> > > taskset qemu-system-x86_64 -enable-kvm -m 1G \
> > > -name bread,debug-threads=on \
> > > -daemonize \
> > > -pidfile /var/log/bread.pid \
> > > -cpu host \
> > > -smp 4,sockets=1,cores=4,threads=1 \
> > > -object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on \
> > > -numa node,memdev=mem0 \
> > > -drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
> > > -chardev socket,id=char0,path=/var/tmp/vhost.0 \
> > > -device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
> > > -machine usb=on \
> > > -device usb-tablet \
> > > -device usb-mouse \
> > > -device usb-kbd \
> > > -vnc :2 \
> > > -net nic,model=virtio \
> > > -net user,hostfwd=tcp::2222-:22
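As an aside: following the suggestion earlier in this thread, the hugepage-backed memory object can also be preallocated and bound to host NUMA node 0 so every guest page is resolvable by SPDK. A sketch of the relevant QEMU options only (a fragment, not a complete command; adjust the size and paths to your setup):

```shell
-object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind \
-numa node,memdev=mem0 \
```

`share=on` is required for vhost-user so the SPDK target can mmap the guest memory; `prealloc=yes` with `policy=bind` additionally pins the allocation at startup instead of faulting pages in later.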
> > >
> > > Then, when I use fio to test the vhost NVMe disk in the guest VM, I get
> > > the following error messages on the host console.
> > >
> > > ==================================================================
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fed64d000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:32
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > > [the same four lines repeat several more times, first with len:32 and then with len:8]
> > >
> > > ==================================================================
> > >
> > > I used lsblk to check the block device information in the guest, and
> > > could see the NVMe disk as sdb.
> > > >lsblk --output "NAME,KNAME,MODEL,HCTL,SIZE,VENDOR,SUBSYSTEMS"
> > >
> > > ==================================================================
> > >
> > > NAME KNAME MODEL HCTL SIZE VENDOR SUBSYSTEMS
> > > fd0 fd0 4K block:platform
> > > loop0 loop0 12.2M block
> > > loop1 loop1 86.6M block
> > > loop2 loop2 1.6M block
> > > loop3 loop3 3.3M block
> > > loop4 loop4 21M block
> > > loop5 loop5 2.3M block
> > > loop6 loop6 13M block
> > > loop7 loop7 3.7M block
> > > loop8 loop8 2.3M block
> > > loop9 loop9 86.9M block
> > > loop10 loop10 34.7M block
> > > loop11 loop11 87M block
> > > loop12 loop12 140.9M block
> > > loop13 loop13 13M block
> > > loop14 loop14 140M block
> > > loop15 loop15 139.5M block
> > > loop16 loop16 3.7M block
> > > loop17 loop17 14.5M block
> > > sda sda QEMU HARDDISK 0:0:0:0 32G ATA block:scsi:pci
> > > sda1 sda1 32G block:scsi:pci
> > > sdb sdb NVMe disk 2:0:0:0 27.3G INTEL block:scsi:virtio:pci
> > > sr0 sr0 QEMU DVD-ROM 1:0:0:0 1024M QEMU block:scsi:pci
> > >
> > > ==================================================================
> > >
> > >
> > > Can anyone give me some help to solve this problem?
> > >
> > > Thanks.
> > > Adam Chang
> > _______________________________________________
> > SPDK mailing list
> > SPDK(a)lists.01.org
> > https://lists.01.org/mailman/listinfo/spdk
> >
>
>
* Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
@ 2018-08-10 5:01 Adam Chang
0 siblings, 0 replies; 9+ messages in thread
From: Adam Chang @ 2018-08-10 5:01 UTC (permalink / raw)
To: spdk
Hi:
Here is my host environment:
==================================================================
Host OS: Ubuntu 18.04 x86_64
Linux Kernel: 4.15.0-30
CPU: Intel i7 8700K
Memory: 32GB
NVME SSD: Intel Optane Memory 32GB
==================================================================
configuration for building QEMU:
==================================================================
./configure --prefix=/usr --target-list=x86_64-softmmu --enable-kvm
--enable-debug --enable-debug-info --enable-modules --enable-linux-aio
--enable-vnc --enable-trace-backends=log --enable-numa --disable-werror
--disable-strip --with-sdlabi=2.0
==================================================================
configuration for building SPDK:
==================================================================
./configure --enable-debug
==================================================================
I checked dmesg, and it showed the following error:
==================================================================
[ 4640.033876] DMAR: intel_iommu_map: iommu width (39) is not sufficient
for the mapped address (7fdce8200000)
==================================================================
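[Editor's note] The DMAR message above can be sanity-checked with a quick calculation: an IOMMU with a 39-bit address width can only map I/O virtual addresses below 2^39, while the mapping at 0x7fdce8200000 needs 47 bits. A minimal sketch (the helper name is invented for illustration):

```python
# Sketch: check whether an address fits within a given IOMMU address width.
# Mirrors the condition behind "iommu width (39) is not sufficient" in dmesg.

def iommu_width_sufficient(addr: int, width_bits: int) -> bool:
    """True if addr is mappable by an IOMMU with the given address width."""
    return addr < (1 << width_bits)

addr = 0x7fdce8200000            # host virtual address from the DMAR error
print(addr.bit_length())         # bits needed for this address -> 47
print(iommu_width_sufficient(addr, 39))  # -> False, matching the dmesg error
print(iommu_width_sufficient(addr, 48))  # -> True
```

This is consistent with the suggestion earlier in the thread: binding the guest memory to a hugepage-backed NUMA node can change where QEMU mmaps it, which matters when the host VA would otherwise land above the IOMMU's reachable range.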
I added log options when starting the vhost target, as follows:
==================================================================
./app/vhost/vhost -S /var/tmp -m 0x3 -L vhost vhost_scsi &
==================================================================
And here is my vhost log:
==================================================================
VHOST_CONFIG: new vhost user connection is 18
VHOST_CONFIG: new device, handle is 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message
VHOST_USER_GET_PROTOCOL_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message
VHOST_USER_SET_PROTOCOL_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_QUEUE_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_OWNER
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:0 file:25
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:1 file:26
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:2 file:27
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
VHOST_CONFIG: guest memory region 0, size: 0x40000000
guest physical addr: 0x0
guest virtual addr: 0x7f4754600000
host virtual addr: 0x7fdce8000000
mmap addr : 0x7fdce8000000
mmap size : 0x40000000
mmap align: 0x200000
mmap off : 0x0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
VHOST_CONFIG: vring kick idx:2 file:29
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
Cannot set up DMA mapping, error 14
vhost.c: 541:spdk_vhost_dev_mem_register: *WARNING*: Failed to register
memory region 0. Future vtophys translation might fail.
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: vring call idx:0 file:30
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: vring call idx:1 file:25
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: vring call idx:2 file:26
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
[... the VHOST_USER_SET_MEM_TABLE message above repeated 106 more times ...]
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_VRING_BASE
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: vring base idx:2 file:259
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
VHOST_CONFIG: guest memory region 0, size: 0x40000000
guest physical addr: 0x0
guest virtual addr: 0x7f4754600000
host virtual addr: 0x7fdce8000000
mmap addr : 0x7fdce8000000
mmap size : 0x40000000
mmap align: 0x200000
mmap off : 0x0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
VHOST_CONFIG: vring kick idx:0 file:27
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: vring kick idx:1 file:28
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: vring kick idx:2 file:29
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: vring call idx:0 file:31
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: vring call idx:1 file:30
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for
vhost controller vhost.0
VHOST_CONFIG: vring call idx:2 file:25
VHOST_CONFIG: virtio is now ready for processing.
vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for
vtophys translation - 0x7fdce8000000 len:0x40000000
vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost
controller vhost.0 on lcore 0
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7fdd1c467000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1511:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7fdd1c467000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1511:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
==================================================================
Thanks,
Adam Chang.
On Thu, Aug 9, 2018 at 9:55 PM Stojaczyk, DariuszX <
dariuszx.stojaczyk(a)intel.com> wrote:
> Thanks,
> The address that vtophys fails on should be mapped. Something went wrong,
> but that vhost log is not particularly helpful because it comes from a
> non-debug app.
> I could ask you to enable debug (./configure --enable-debug), but could
> you first provide the dmesg? Are there any errors?
> D.
>
> > -----Original Message-----
> > From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> > Sent: Thursday, August 9, 2018 12:56 PM
> > To: Storage Performance Development Kit <spdk(a)lists.01.org>
> > Subject: Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> >
> > Hi:
> > I added "-numa node,memdev=mem0" to the QEMU command line, but still got
> > the same error message.
> > Here are my modified QEMU command arguments:
> >
> >
> > ==================================================================
> > taskset -c 2,3,4,5 qemu-system-x86_64 -enable-kvm -m 1G \
> > -name bread,debug-threads=on \
> > -daemonize \
> > -pidfile /var/log/bread.pid \
> > -cpu host \
> > -smp 4,sockets=1,cores=4,threads=1 \
> > -object memory-backend-file,id=mem0,size=1G,mem-
> > path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind -numa
> > node,memdev=mem0 \
> > -drive
> > file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
> > -chardev socket,id=char0,path=/var/tmp/vhost.0 \
> > -device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
> > -machine usb=on \
> > -device usb-tablet \
> > -device usb-mouse \
> > -device usb-kbd \
> > -vnc :2 \
> > -net nic,model=virtio \
> > -net user,hostfwd=tcp::2222-:22
> > ==================================================================
> >
> > And the following is the vhost log from QEMU starting:
> > ==================================================================
> > VHOST_CONFIG: new vhost user connection is 18
> > VHOST_CONFIG: new device, handle is 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_PROTOCOL_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_PROTOCOL_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_QUEUE_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_OWNER
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:0 file:25
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:1 file:26
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:2 file:27
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_MEM_TABLE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> > guest physical addr: 0x0
> > guest virtual addr: 0x7fa1a4a00000
> > host virtual addr: 0x7f8fb4000000
> > mmap addr : 0x7f8fb4000000
> > mmap size : 0x40000000
> > mmap align: 0x200000
> > mmap off : 0x0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:2 file:29
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:0 file:30
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:1 file:25
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:2 file:26
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_MEM_TABLE
> > [... the VHOST_USER_SET_MEM_TABLE message above repeated 106 more times ...]
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_GET_VRING_BASE
> > VHOST_CONFIG: vring base idx:2 file:259
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_MEM_TABLE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> > guest physical addr: 0x0
> > guest virtual addr: 0x7fa1a4a00000
> > host virtual addr: 0x7f8fb4000000
> > mmap addr : 0x7f8fb4000000
> > mmap size : 0x40000000
> > mmap align: 0x200000
> > mmap off : 0x0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:0 file:27
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:1 file:28
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:2 file:29
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:0 file:31
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:1 file:30
> > VHOST_CONFIG: virtio is now ready for processing.
> > VHOST_CONFIG: /var/tmp/vhost.0: read message
> > VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:2 file:25
> > VHOST_CONFIG: virtio is now ready for processing.
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > [previous vtophys(0x7f8fe7fc9000) error sequence (lba:0) repeated 53 more times]
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc8000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:24 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > [previous vtophys(0x7f8fe7fc8000) error sequence (lba:24) repeated 5 more times]
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7fc9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > [previous vtophys(0x7f8fe7fc9000) error sequence (lba:0) repeated 11 more times]
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe7f2a000) failed
> > [... the same READ lba:57149312 error block (vtophys(0x7f8fe7f2a000)
> > failed) repeats many more times ...]
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fe98a9000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:57149312 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > [... the same READ lba:57149312 error block (vtophys(0x7f8fe98a9000)
> > failed) repeats many more times ...]
> >
> > ==================================================================
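A note on reading the dump above: the failures hit only a handful of distinct guest addresses. When a log like this is captured to a file, a short pipeline makes that obvious. The `vhost.log` contents below are a small hypothetical excerpt written out just so the snippet is self-contained; in practice, redirect the vhost console output to the file and run the same `grep | sort | uniq -c` pipeline on it:

```shell
# Build a tiny sample log (hypothetical excerpt of the output above),
# then count how often each address failed vtophys translation.
printf '%s\n' \
  'nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7f2a000) failed' \
  'nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7f2a000) failed' \
  'nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe98a9000) failed' \
  > vhost.log
grep -o 'vtophys(0x[0-9a-f]*)' vhost.log | sort | uniq -c | sort -rn
```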
> >
> > Thanks,
> > Adam Chang.
> >
> > On Thu, Aug 9, 2018 at 4:07 PM Stojaczyk, DariuszX
> > <dariuszx.stojaczyk(a)intel.com <mailto:dariuszx.stojaczyk(a)intel.com> >
> wrote:
> >
> >
> > Can you provide a full vhost log?
> > D.
> >
> > > -----Original Message-----
> > > From: SPDK [mailto:spdk-bounces(a)lists.01.org <mailto:spdk-
> > bounces(a)lists.01.org> ] On Behalf Of Adam Chang
> > > Sent: Thursday, August 9, 2018 4:05 AM
> > > To: spdk(a)lists.01.org <mailto:spdk(a)lists.01.org>
> > > Subject: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> > >
> > > Hi all:
> > > I just created an NVMe bdev and a vhost-scsi controller that can be
> > > accessed by QEMU, but an error occurred when I/O was issued from the VM.
> > > Here are my steps for the SPDK configuration:
> > >
> > > Host OS:Ubuntu 18.04, Kernel 4.15.0-30
> > > Guest OS: Ubuntu 18.04
> > > QEMU: 2.12.0
> > > SPDK: v18.07
> > >
> > > 1) sudo HUGEMEM=4096 scripts/setup.sh
> > >
> > > 0000:05:00.0 (8086 2522): nvme -> vfio-pci
> > >
> > > Current user memlock limit: 4116 MB
> > >
> > > This is the maximum amount of memory you will be
> > > able to use with DPDK and VFIO if run as current user.
> > > To change this, please adjust limits.conf memlock limit for
> current user.
> > >
> > > 2) sudo ./app/vhost/vhost -S /var/tmp -m 0x3 &
> > >
> > > [ DPDK EAL parameters: vhost -c 0x3 -m 1024 --legacy-mem --file-
> > > prefix=spdk_pid1921 ]
> > > EAL: Detected 12 lcore(s)
> > > EAL: Detected 1 NUMA nodes
> > > EAL: Multi-process socket /var/run/dpdk/spdk_pid1921/mp_socket
> > > EAL: No free hugepages reported in hugepages-1048576kB
> > > EAL: Probing VFIO support...
> > > EAL: VFIO support initialized
> > > app.c: 530:spdk_app_start: *NOTICE*: Total cores available: 2
> > > reactor.c: 718:spdk_reactors_init: *NOTICE*: Occupied cpu socket
> mask
> > is 0x1
> > > reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on
> core 1
> > on socket
> > > 0
> > > reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on
> core 0
> > on socket
> > > 0
> > >
> > > 3) sudo ./scripts/rpc.py construct_vhost_scsi_controller
> --cpumask 0x1
> > vhost.0
> > > EAL: PCI device 0000:05:00.0 on NUMA socket 0
> > > EAL: probe driver: 8086:2522 spdk_nvme
> > > EAL: using IOMMU type 1 (Type 1)
> > > Nvme0n1
> > >
> > > 4) sudo ./scripts/rpc.py add_vhost_scsi_lun vhost.0 0 Nvme0n1
> > > 5) start qemu:
> > > taskset qemu-system-x86_64 -enable-kvm -m 1G \
> > > -name bread,debug-threads=on \
> > > -daemonize \
> > > -pidfile /var/log/bread.pid \
> > > -cpu host\
> > > -smp 4,sockets=1,cores=4,threads=1 \
> > > -object memory-backend-file,id=mem0,size=1G,mem-
> > > path=/dev/hugepages,share=on -numa node,memdev=mem0\
> > > -drive
> > >
> > file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2\
> > > -chardev socket,id=char0,path=/var/tmp/vhost.0 \
> > > -device vhost-user-scsi-pci,id=scsi0,chardev=char0\
> > > -machine usb=on \
> > > -device usb-tablet \
> > > -device usb-mouse \
> > > -device usb-kbd \
> > > -vnc :2 \
> > > -net nic,model=virtio\
> > > -net user,hostfwd=tcp::2222-:22
> > >
> > > Then, when I use fio to test the vhost NVMe disk in the guest VM, I get
> > > the following error messages on the host console.
> > >
> > ==================================================================
> > > =========
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fed64d000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:32
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > [... the same READ lba:0 len:32 error block (vtophys(0x7f8fed64d000)
> > > failed) repeats many more times ...]
> > > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > > vtophys(0x7f8fed64d000) failed
> > > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> > > cid:95 nsid:1 lba:0 len:8
> > > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> > > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> > > [... the same READ lba:0 len:8 error block (vtophys(0x7f8fed64d000)
> > > failed) repeats many more times ...]
> > >
> > ==================================================================
> > > =========
> > >
> > > I used lsblk to check the block device information in the guest, and
> > > could see the NVMe disk as sdb.
> > > >lsblk --output
> > "NAME,KNAME,MODEL,HCTL,SIZE,VENDOR,SUBSYSTEMS"
> > >
> > ==================================================================
> > > =========
> > >
> > > NAME KNAME MODEL HCTL SIZE VENDOR
> SUBSYSTEMS
> > > fd0 fd0 4K
> block:platform
> > > loop0 loop0 12.2M block
> > > loop1 loop1 86.6M block
> > > loop2 loop2 1.6M block
> > > loop3 loop3 3.3M block
> > > loop4 loop4 21M block
> > > loop5 loop5 2.3M block
> > > loop6 loop6 13M block
> > > loop7 loop7 3.7M block
> > > loop8 loop8 2.3M block
> > > loop9 loop9 86.9M block
> > > loop10 loop10 34.7M block
> > > loop11 loop11 87M block
> > > loop12 loop12 140.9M block
> > > loop13 loop13 13M block
> > > loop14 loop14 140M block
> > > loop15 loop15 139.5M block
> > > loop16 loop16 3.7M block
> > > loop17 loop17 14.5M block
> > > sda sda QEMU HARDDISK 0:0:0:0 32G ATA
> block:scsi:pci
> > > sda1 sda1 32G
> block:scsi:pci
> > > sdb sdb NVMe disk 2:0:0:0 27.3G INTEL
> block:scsi:virtio:pci
> > > sr0 sr0 QEMU DVD-ROM 1:0:0:0 1024M QEMU
> block:scsi:pci
> > >
> > ==================================================================
> > > =========
> > >
> > >
> > > Can anyone help me figure out how to solve this problem?
> > >
> > > Thanks.
> > > Adam Chang
> > _______________________________________________
> > SPDK mailing list
> > SPDK(a)lists.01.org <mailto:SPDK(a)lists.01.org>
> > https://lists.01.org/mailman/listinfo/spdk
> >
>
> _______________________________________________
> SPDK mailing list
> SPDK(a)lists.01.org
> https://lists.01.org/mailman/listinfo/spdk
>
* Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
@ 2018-08-09 13:55 Stojaczyk, DariuszX
0 siblings, 0 replies; 9+ messages in thread
From: Stojaczyk, DariuszX @ 2018-08-09 13:55 UTC (permalink / raw)
To: spdk
Thanks,
The address that vtophys fails on should be mapped. Something went wrong, but that vhost log is not particularly helpful because it comes from a non-debug app.
I could ask you to enable debug (./configure --enable-debug), but could you first provide the dmesg output? Are there any errors there?
D.
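Dariusz's point that the failing address should be mapped can be checked directly against the logs in this thread: VHOST_CONFIG reports the guest memory region mmap'd at host virtual address 0x7f8fb4000000 with size 0x40000000, and every address that vtophys() rejected falls inside that range. A minimal sanity check (constants taken from the logged output, not from a live system):

```python
# Addresses rejected by vtophys(), as printed in the error logs.
failing = [0x7f8fe7fc9000, 0x7f8fe7f2a000, 0x7f8fe98a9000, 0x7f8fed64d000]

# Guest memory region reported by VHOST_CONFIG:
#   host virtual addr: 0x7f8fb4000000, mmap size: 0x40000000 (1 GiB)
region_base = 0x7f8fb4000000
region_size = 0x40000000

for addr in failing:
    offset = addr - region_base
    inside = 0 <= offset < region_size
    print(hex(addr), "offset", hex(offset), "inside region:", inside)
```

All four addresses land inside the mmap'd region, which suggests the failure is in SPDK's internal address translation (the buffers are not backed by memory it has registered) rather than an out-of-range guest address.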
> -----Original Message-----
> From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> Sent: Thursday, August 9, 2018 12:56 PM
> To: Storage Performance Development Kit <spdk(a)lists.01.org>
> Subject: Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
>
> Hi:
> I have added "-numa node,memdev=mem0" to the QEMU command line, but I still
> get the same error message.
> Here are my modified QEMU command arguments:
>
>
> ==================================================================
> taskset -c 2,3,4,5 qemu-system-x86_64 -enable-kvm -m 1G \
> -name bread,debug-threads=on \
> -daemonize \
> -pidfile /var/log/bread.pid \
> -cpu host\
> -smp 4,sockets=1,cores=4,threads=1 \
> -object memory-backend-file,id=mem0,size=1G,mem-
> path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind -numa
> node,memdev=mem0\
> -drive
> file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2\
> -chardev socket,id=char0,path=/var/tmp/vhost.0 \
> -device vhost-user-scsi-pci,id=scsi0,chardev=char0\
> -machine usb=on \
> -device usb-tablet \
> -device usb-mouse \
> -device usb-kbd \
> -vnc :2 \
> -net nic,model=virtio\
> -net user,hostfwd=tcp::2222-:22
> ==================================================================
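One thing worth verifying alongside the command line above: with share=on and mem-path=/dev/hugepages, QEMU needs enough free hugepages on the host, and the mem-path mount must actually be hugetlbfs — otherwise guest memory ends up backed by ordinary pages that the vhost target cannot translate. A quick host-side check (a sketch; exact mount points vary per system):

```shell
# Show hugepage totals/free and the configured hugepage size on the host.
grep -E 'HugePages_(Total|Free)|Hugepagesize' /proc/meminfo
# Confirm /dev/hugepages (or wherever mem-path points) is a hugetlbfs mount.
mount | grep -i hugetlbfs || echo "no hugetlbfs mount found"
```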
>
> And the following is the vhost log from QEMU startup:
> ==================================================================
> VHOST_CONFIG: new vhost user connection is 18
> VHOST_CONFIG: new device, handle is 0
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_GET_PROTOCOL_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_PROTOCOL_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_GET_QUEUE_NUM
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_OWNER
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:0 file:25
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:1 file:26
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:2 file:27
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_MEM_TABLE
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_NUM
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_BASE
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_ADDR
> VHOST_CONFIG: guest memory region 0, size: 0x40000000
> guest physical addr: 0x0
> guest virtual addr: 0x7fa1a4a00000
> host virtual addr: 0x7f8fb4000000
> mmap addr : 0x7f8fb4000000
> mmap size : 0x40000000
> mmap align: 0x200000
> mmap off : 0x0
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> VHOST_CONFIG: vring kick idx:2 file:29
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:0 file:30
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:1 file:25
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:2 file:26
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_MEM_TABLE
> [... the VHOST_USER_SET_MEM_TABLE message pair repeats many more times ...]
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_GET_VRING_BASE
> VHOST_CONFIG: vring base idx:2 file:259
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_MEM_TABLE
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_NUM
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_BASE
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_ADDR
> VHOST_CONFIG: guest memory region 0, size: 0x40000000
> guest physical addr: 0x0
> guest virtual addr: 0x7fa1a4a00000
> host virtual addr: 0x7f8fb4000000
> mmap addr : 0x7f8fb4000000
> mmap size : 0x40000000
> mmap align: 0x200000
> mmap off : 0x0
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> VHOST_CONFIG: vring kick idx:0 file:27
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_NUM
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_BASE
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_ADDR
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> VHOST_CONFIG: vring kick idx:1 file:28
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_NUM
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_BASE
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_ADDR
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> VHOST_CONFIG: vring kick idx:2 file:29
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:0 file:31
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:1 file:30
> VHOST_CONFIG: virtio is now ready for processing.
> VHOST_CONFIG: /var/tmp/vhost.0: read message
> VHOST_USER_SET_VRING_CALL
> VHOST_CONFIG: vring call idx:2 file:25
> VHOST_CONFIG: virtio is now ready for processing.
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe7fc9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:0 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe7fc8000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:24 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe7f2a000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> vtophys(0x7f8fe98a9000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> cid:95 nsid:1 lba:57149312 len:8
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
>
> ==================================================================
>
> Thanks,
> Adam Chang.
>
> On Thu, Aug 9, 2018 at 4:07 PM Stojaczyk, DariuszX <dariuszx.stojaczyk(a)intel.com> wrote:
>
>
> Can you provide a full vhost log?
> D.
>
> > -----Original Message-----
> > From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> > Sent: Thursday, August 9, 2018 4:05 AM
> > To: spdk(a)lists.01.org
> > Subject: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> >
> > Hi all:
> > I just created an NVMe bdev and a vhost-scsi controller that can be accessed by
> > QEMU, but an error occurred when I/O was issued from the VM.
> > Here are my steps for SPDK configuration
> >
> > Host OS:Ubuntu 18.04, Kernel 4.15.0-30
> > Guest OS: Ubuntu 18.04
> > QEMU: 2.12.0
> > SPDK: v18.07
> >
> > 1) sudo HUGEMEM=4096 scripts/setup.sh
> >
> > 0000:05:00.0 (8086 2522): nvme -> vfio-pci
> >
> > Current user memlock limit: 4116 MB
> >
> > This is the maximum amount of memory you will be
> > able to use with DPDK and VFIO if run as current user.
> > To change this, please adjust limits.conf memlock limit for current user.
> >
> > 2) sudo ./app/vhost/vhost -S /var/tmp -m 0x3 &
> >
> > [ DPDK EAL parameters: vhost -c 0x3 -m 1024 --legacy-mem --file-prefix=spdk_pid1921 ]
> > EAL: Detected 12 lcore(s)
> > EAL: Detected 1 NUMA nodes
> > EAL: Multi-process socket /var/run/dpdk/spdk_pid1921/mp_socket
> > EAL: No free hugepages reported in hugepages-1048576kB
> > EAL: Probing VFIO support...
> > EAL: VFIO support initialized
> > app.c: 530:spdk_app_start: *NOTICE*: Total cores available: 2
> > reactor.c: 718:spdk_reactors_init: *NOTICE*: Occupied cpu socket mask is 0x1
> > reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 1 on socket 0
> > reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 0 on socket 0
> >
> > 3) sudo ./scripts/rpc.py construct_vhost_scsi_controller --cpumask 0x1 vhost.0
> > EAL: PCI device 0000:05:00.0 on NUMA socket 0
> > EAL: probe driver: 8086:2522 spdk_nvme
> > EAL: using IOMMU type 1 (Type 1)
> > Nvme0n1
> >
> > 4) sudo ./scripts/rpc.py add_vhost_scsi_lun vhost.0 0 Nvme0n1
> > 5) start qemu:
> > taskset qemu-system-x86_64 -enable-kvm -m 1G \
> > -name bread,debug-threads=on \
> > -daemonize \
> > -pidfile /var/log/bread.pid \
> > -cpu host\
> > -smp 4,sockets=1,cores=4,threads=1 \
> > -object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on -numa node,memdev=mem0 \
> > -drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
> > -chardev socket,id=char0,path=/var/tmp/vhost.0 \
> > -device vhost-user-scsi-pci,id=scsi0,chardev=char0\
> > -machine usb=on \
> > -device usb-tablet \
> > -device usb-mouse \
> > -device usb-kbd \
> > -vnc :2 \
> > -net nic,model=virtio\
> > -net user,hostfwd=tcp::2222-:22
> >
> > Then, when I used fio to test the vhost NVMe disk in the guest VM, I got the
> > following error message on the host console.
> >
> > ===========================================================================
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fed64d000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:32
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > [the vtophys/READ/INVALID FIELD sequence above repeats many more times, later with len:8 instead of len:32]
> > ===========================================================================
> >
> > I used lsblk to check the block device information in the guest, and could see the
> > NVMe disk as sdb.
> > >lsblk --output "NAME,KNAME,MODEL,HCTL,SIZE,VENDOR,SUBSYSTEMS"
> > ===========================================================================
> >
> > NAME KNAME MODEL HCTL SIZE VENDOR SUBSYSTEMS
> > fd0 fd0 4K block:platform
> > loop0 loop0 12.2M block
> > loop1 loop1 86.6M block
> > loop2 loop2 1.6M block
> > loop3 loop3 3.3M block
> > loop4 loop4 21M block
> > loop5 loop5 2.3M block
> > loop6 loop6 13M block
> > loop7 loop7 3.7M block
> > loop8 loop8 2.3M block
> > loop9 loop9 86.9M block
> > loop10 loop10 34.7M block
> > loop11 loop11 87M block
> > loop12 loop12 140.9M block
> > loop13 loop13 13M block
> > loop14 loop14 140M block
> > loop15 loop15 139.5M block
> > loop16 loop16 3.7M block
> > loop17 loop17 14.5M block
> > sda sda QEMU HARDDISK 0:0:0:0 32G ATA block:scsi:pci
> > sda1 sda1 32G block:scsi:pci
> > sdb sdb NVMe disk 2:0:0:0 27.3G INTEL block:scsi:virtio:pci
> > sr0 sr0 QEMU DVD-ROM 1:0:0:0 1024M QEMU block:scsi:pci
> > ===========================================================================
> >
> >
> > Can anyone help me figure out how to solve this problem?
> >
> > Thanks.
> > Adam Chang
> _______________________________________________
> SPDK mailing list
> SPDK(a)lists.01.org
> https://lists.01.org/mailman/listinfo/spdk
>
^ permalink raw reply [flat|nested] 9+ messages in thread
* Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
@ 2018-08-09 12:42 Wodkowski, PawelX
0 siblings, 0 replies; 9+ messages in thread
From: Wodkowski, PawelX @ 2018-08-09 12:42 UTC (permalink / raw)
To: spdk
[-- Attachment #1: Type: text/plain, Size: 60593 bytes --]
Nope, your QEMU command works fine on my setup (although I’m using Ubuntu 17.10 as the guest OS). The only suspicious thing is the number of VHOST_USER_SET_MEM_TABLE messages; I don’t know why QEMU is sending so many of them.
Can you provide more details about your host OS and hardware, and how you built SPDK and QEMU?
Pawel
From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
Sent: Thursday, August 9, 2018 12:56 PM
To: Storage Performance Development <spdk(a)lists.01.org>
Subject: Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
Hi:
I have added "-numa node,memdev=mem0" to the QEMU command line, but I still get the same error message.
Here are my modified QEMU command arguments:
==================================================================
taskset -c 2,3,4,5 qemu-system-x86_64 -enable-kvm -m 1G \
-name bread,debug-threads=on \
-daemonize \
-pidfile /var/log/bread.pid \
-cpu host \
-smp 4,sockets=1,cores=4,threads=1 \
-object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind -numa node,memdev=mem0 \
-drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
-chardev socket,id=char0,path=/var/tmp/vhost.0 \
-device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
-machine usb=on \
-device usb-tablet \
-device usb-mouse \
-device usb-kbd \
-vnc :2 \
-net nic,model=virtio \
-net user,hostfwd=tcp::2222-:22
==================================================================
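As a quick aside for anyone reproducing this setup: a hugepage-backed memory-backend-file only helps if /dev/hugepages is a hugetlbfs mount with enough free hugepages to cover the guest RAM. A minimal sketch of that pre-flight check (hypothetical helper functions, not part of SPDK or QEMU), operating on /proc/meminfo-style text:

```python
# Hypothetical helper: verify that free hugepages can back a given
# QEMU memory-backend-file size before launching the VM.

def parse_meminfo(text):
    """Parse /proc/meminfo-style 'Field: value [kB]' lines into a dict."""
    info = {}
    for line in text.splitlines():
        if ":" not in line:
            continue
        key, _, rest = line.partition(":")
        parts = rest.split()
        if parts and parts[0].isdigit():
            info[key.strip()] = int(parts[0])
    return info

def can_back_guest_ram(meminfo_text, guest_ram_bytes):
    """True if HugePages_Free * Hugepagesize covers the requested guest RAM."""
    info = parse_meminfo(meminfo_text)
    free_pages = info.get("HugePages_Free", 0)
    page_kb = info.get("Hugepagesize", 0)  # kernel reports this in kB
    return free_pages * page_kb * 1024 >= guest_ram_bytes

sample = """\
HugePages_Total:    2048
HugePages_Free:     2048
Hugepagesize:       2048 kB
"""
# 2048 free 2 MiB pages = 4 GiB available, so a 1 GiB guest fits.
print(can_back_guest_ram(sample, 1 << 30))  # prints True
```

On a real host you would feed it the actual contents of /proc/meminfo.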
And the following is the vhost log, starting from QEMU launch:
==================================================================
VHOST_CONFIG: new vhost user connection is 18
VHOST_CONFIG: new device, handle is 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_PROTOCOL_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_PROTOCOL_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_QUEUE_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_OWNER
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:0 file:25
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:1 file:26
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:2 file:27
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
VHOST_CONFIG: guest memory region 0, size: 0x40000000
guest physical addr: 0x0
guest virtual addr: 0x7fa1a4a00000
host virtual addr: 0x7f8fb4000000
mmap addr : 0x7f8fb4000000
mmap size : 0x40000000
mmap align: 0x200000
mmap off : 0x0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
VHOST_CONFIG: vring kick idx:2 file:29
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:0 file:30
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:1 file:25
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:2 file:26
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
[the VHOST_USER_SET_MEM_TABLE message above repeats more than 100 times]
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_VRING_BASE
VHOST_CONFIG: vring base idx:2 file:259
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
VHOST_CONFIG: guest memory region 0, size: 0x40000000
guest physical addr: 0x0
guest virtual addr: 0x7fa1a4a00000
host virtual addr: 0x7f8fb4000000
mmap addr : 0x7f8fb4000000
mmap size : 0x40000000
mmap align: 0x200000
mmap off : 0x0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
VHOST_CONFIG: vring kick idx:0 file:27
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
VHOST_CONFIG: vring kick idx:1 file:28
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
VHOST_CONFIG: vring kick idx:2 file:29
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:0 file:31
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:1 file:30
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:2 file:25
VHOST_CONFIG: virtio is now ready for processing.
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
[the vtophys/READ/INVALID FIELD sequence above repeats for the remainder of the log]
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc8000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:24 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
[the four lines above repeat identically, 6 times in total for vtophys(0x7f8fe7fc8000), lba:24]
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
[the four lines above repeat identically, 12 times in total for vtophys(0x7f8fe7fc9000), lba:0]
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7f2a000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:57149312 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
[the four lines above repeat identically, 12 times in total for vtophys(0x7f8fe7f2a000), lba:57149312]
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe98a9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:57149312 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
[the four lines above repeat identically, 12 times in total for vtophys(0x7f8fe98a9000), lba:57149312]
==================================================================
Thanks,
Adam Chang.
On Thu, Aug 9, 2018 at 4:07 PM Stojaczyk, DariuszX <dariuszx.stojaczyk(a)intel.com> wrote:
Can you provide a full vhost log?
D.
> -----Original Message-----
> From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> Sent: Thursday, August 9, 2018 4:05 AM
> To: spdk(a)lists.01.org
> Subject: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
>
> Hi all:
> I just created an NVMe bdev and a vhost-scsi controller that can be accessed by
> QEMU, but an error occurred when IO was issued from the VM.
> Here are my steps for SPDK configuration
>
> Host OS: Ubuntu 18.04, Kernel 4.15.0-30
> Guest OS: Ubuntu 18.04
> QEMU: 2.12.0
> SPDK: v18.07
>
> 1) sudo HUGEMEM=4096 scripts/setup.sh
>
> 0000:05:00.0 (8086 2522): nvme -> vfio-pci
>
> Current user memlock limit: 4116 MB
>
> This is the maximum amount of memory you will be
> able to use with DPDK and VFIO if run as current user.
> To change this, please adjust limits.conf memlock limit for current user.
>
> 2) sudo ./app/vhost/vhost -S /var/tmp -m 0x3 &
>
> [ DPDK EAL parameters: vhost -c 0x3 -m 1024 --legacy-mem --file-
> prefix=spdk_pid1921 ]
> EAL: Detected 12 lcore(s)
> EAL: Detected 1 NUMA nodes
> EAL: Multi-process socket /var/run/dpdk/spdk_pid1921/mp_socket
> EAL: No free hugepages reported in hugepages-1048576kB
> EAL: Probing VFIO support...
> EAL: VFIO support initialized
> app.c: 530:spdk_app_start: *NOTICE*: Total cores available: 2
> reactor.c: 718:spdk_reactors_init: *NOTICE*: Occupied cpu socket mask is 0x1
> reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 1 on socket
> 0
> reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 0 on socket
> 0
>
> 3) sudo ./scripts/rpc.py construct_vhost_scsi_controller --cpumask 0x1 vhost.0
> EAL: PCI device 0000:05:00.0 on NUMA socket 0
> EAL: probe driver: 8086:2522 spdk_nvme
> EAL: using IOMMU type 1 (Type 1)
> Nvme0n1
>
> 4) sudo ./scripts/rpc.py add_vhost_scsi_lun vhost.0 0 Nvme0n1
> 5) start qemu:
> taskset qemu-system-x86_64 -enable-kvm -m 1G \
> -name bread,debug-threads=on \
> -daemonize \
> -pidfile /var/log/bread.pid \
> -cpu host \
> -smp 4,sockets=1,cores=4,threads=1 \
> -object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on -numa node,memdev=mem0 \
> -drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
> -chardev socket,id=char0,path=/var/tmp/vhost.0 \
> -device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
> -machine usb=on \
> -device usb-tablet \
> -device usb-mouse \
> -device usb-kbd \
> -vnc :2 \
> -net nic,model=virtio \
> -net user,hostfwd=tcp::2222-:22
>
> Then, when I use fio to test the vhost NVMe disk in the guest VM, I get the
> following error messages on the host console.
> ===========================================================================
> nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fed64d000) failed
> nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:32
> nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> [the same four messages repeat for every retried READ, later with len:8 instead of len:32]
> ===========================================================================
>
> I used lsblk to check the block device information in the guest, and could see
> the NVMe disk as sdb.
> >lsblk --output "NAME,KNAME,MODEL,HCTL,SIZE,VENDOR,SUBSYSTEMS"
> ===========================================================================
>
> NAME KNAME MODEL HCTL SIZE VENDOR SUBSYSTEMS
> fd0 fd0 4K block:platform
> loop0 loop0 12.2M block
> loop1 loop1 86.6M block
> loop2 loop2 1.6M block
> loop3 loop3 3.3M block
> loop4 loop4 21M block
> loop5 loop5 2.3M block
> loop6 loop6 13M block
> loop7 loop7 3.7M block
> loop8 loop8 2.3M block
> loop9 loop9 86.9M block
> loop10 loop10 34.7M block
> loop11 loop11 87M block
> loop12 loop12 140.9M block
> loop13 loop13 13M block
> loop14 loop14 140M block
> loop15 loop15 139.5M block
> loop16 loop16 3.7M block
> loop17 loop17 14.5M block
> sda sda QEMU HARDDISK 0:0:0:0 32G ATA block:scsi:pci
> sda1 sda1 32G block:scsi:pci
> sdb sdb NVMe disk 2:0:0:0 27.3G INTEL block:scsi:virtio:pci
> sr0 sr0 QEMU DVD-ROM 1:0:0:0 1024M QEMU block:scsi:pci
> ===========================================================================
>
>
> Can anyone help me solve this problem?
>
> Thanks.
> Adam Chang
_______________________________________________
SPDK mailing list
SPDK(a)lists.01.org
https://lists.01.org/mailman/listinfo/spdk
* Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
@ 2018-08-09 10:56 Adam Chang
0 siblings, 0 replies; 9+ messages in thread
From: Adam Chang @ 2018-08-09 10:56 UTC (permalink / raw)
To: spdk
Hi:
I have added "-numa node,memdev=mem0" to the QEMU command line, but I still get
the same error message.
Here is my modified QEMU command:
==================================================================
taskset -c 2,3,4,5 qemu-system-x86_64 -enable-kvm -m 1G \
-name bread,debug-threads=on \
-daemonize \
-pidfile /var/log/bread.pid \
-cpu host \
-smp 4,sockets=1,cores=4,threads=1 \
-object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind \
-numa node,memdev=mem0 \
-drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
-chardev socket,id=char0,path=/var/tmp/vhost.0 \
-device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
-machine usb=on \
-device usb-tablet \
-device usb-mouse \
-device usb-kbd \
-vnc :2 \
-net nic,model=virtio \
-net user,hostfwd=tcp::2222-:22
==================================================================
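For reference, the vtophys() failures below usually mean the vhost target was handed guest memory it could not register with DPDK, i.e. memory not actually backed by hugepages. A quick sanity check is sketched here; the 2 MiB hugepage size is an assumption (the x86_64 default), and the sizes and pidfile path are taken from the command above:

```shell
#!/bin/sh
# Sketch only: verify the guest RAM size divides evenly into hugepages.
# Assumes the x86_64 default hugepage size of 2 MiB; adjust if the host
# uses 1 GiB hugepages instead.
hugepage_kb=2048                 # 2 MiB hugepage, in KiB
guest_mem_kb=$((1024 * 1024))    # "-m 1G" from the command line above

if [ $((guest_mem_kb % hugepage_kb)) -eq 0 ]; then
    echo "guest RAM size is hugepage-aligned"
fi

# Once the VM is up, the backing can be confirmed on the host, e.g.:
#   pid=$(cat /var/log/bread.pid)              # -pidfile from the command above
#   grep -c /dev/hugepages "/proc/$pid/maps"   # should print a non-zero count
```

If the maps check shows no /dev/hugepages entries, the memory-backend-file object is not actually supplying the guest RAM, which matches the vtophys() errors in the log.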
And the following is the vhost log from QEMU startup:
==================================================================
VHOST_CONFIG: new vhost user connection is 18
VHOST_CONFIG: new device, handle is 0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message
VHOST_USER_GET_PROTOCOL_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message
VHOST_USER_SET_PROTOCOL_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_QUEUE_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_OWNER
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:0 file:25
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:1 file:26
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:2 file:27
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
VHOST_CONFIG: guest memory region 0, size: 0x40000000
guest physical addr: 0x0
guest virtual addr: 0x7fa1a4a00000
host virtual addr: 0x7f8fb4000000
mmap addr : 0x7f8fb4000000
mmap size : 0x40000000
mmap align: 0x200000
mmap off : 0x0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
VHOST_CONFIG: vring kick idx:2 file:29
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:0 file:30
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:1 file:25
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:2 file:26
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
[the same VHOST_USER_SET_MEM_TABLE message repeats many more times]
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_VRING_BASE
VHOST_CONFIG: vring base idx:2 file:259
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
VHOST_CONFIG: guest memory region 0, size: 0x40000000
guest physical addr: 0x0
guest virtual addr: 0x7fa1a4a00000
host virtual addr: 0x7f8fb4000000
mmap addr : 0x7f8fb4000000
mmap size : 0x40000000
mmap align: 0x200000
mmap off : 0x0
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
VHOST_CONFIG: vring kick idx:0 file:27
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
VHOST_CONFIG: vring kick idx:1 file:28
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
VHOST_CONFIG: vring kick idx:2 file:29
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:0 file:31
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:1 file:30
VHOST_CONFIG: virtio is now ready for processing.
VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
VHOST_CONFIG: vring call idx:2 file:25
VHOST_CONFIG: virtio is now ready for processing.
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
vtophys(0x7f8fe7fc9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95
nsid:1 lba:0 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
(00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc8000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:24 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7f2a000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:57149312 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe98a9000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:57149312 len:8
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
==================================================================
Thanks,
Adam Chang.
On Thu, Aug 9, 2018 at 4:07 PM Stojaczyk, DariuszX <
dariuszx.stojaczyk(a)intel.com> wrote:
> Can you provide a full vhost log?
> D.
>
> > -----Original Message-----
> > From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> > Sent: Thursday, August 9, 2018 4:05 AM
> > To: spdk(a)lists.01.org
> > Subject: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> >
> > Hi all:
> > I just created an NVMe bdev and a vhost-scsi controller that can be
> > accessed by QEMU, but an error occurred when IO was issued from the VM.
> > Here are my steps for SPDK configuration
> >
> > Host OS:Ubuntu 18.04, Kernel 4.15.0-30
> > Guest OS: Ubuntu 18.04
> > QEMU: 2.12.0
> > SPDK: v18.07
> >
> > 1) sudo HUGEMEM=4096 scripts/setup.sh
> >
> > 0000:05:00.0 (8086 2522): nvme -> vfio-pci
> >
> > Current user memlock limit: 4116 MB
> >
> > This is the maximum amount of memory you will be
> > able to use with DPDK and VFIO if run as current user.
> > To change this, please adjust limits.conf memlock limit for current user.
> >
> > 2) sudo ./app/vhost/vhost -S /var/tmp -m 0x3 &
> >
> > [ DPDK EAL parameters: vhost -c 0x3 -m 1024 --legacy-mem --file-
> > prefix=spdk_pid1921 ]
> > EAL: Detected 12 lcore(s)
> > EAL: Detected 1 NUMA nodes
> > EAL: Multi-process socket /var/run/dpdk/spdk_pid1921/mp_socket
> > EAL: No free hugepages reported in hugepages-1048576kB
> > EAL: Probing VFIO support...
> > EAL: VFIO support initialized
> > app.c: 530:spdk_app_start: *NOTICE*: Total cores available: 2
> > reactor.c: 718:spdk_reactors_init: *NOTICE*: Occupied cpu socket mask is 0x1
> > reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 1 on socket 0
> > reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 0 on socket 0
> >
> > 3) sudo ./scripts/rpc.py construct_vhost_scsi_controller --cpumask 0x1 vhost.0
> > EAL: PCI device 0000:05:00.0 on NUMA socket 0
> > EAL: probe driver: 8086:2522 spdk_nvme
> > EAL: using IOMMU type 1 (Type 1)
> > Nvme0n1
> >
> > 4) sudo ./scripts/rpc.py add_vhost_scsi_lun vhost.0 0 Nvme0n1
> > 5) start qemu:
> > taskset qemu-system-x86_64 -enable-kvm -m 1G \
> >   -name bread,debug-threads=on \
> >   -daemonize \
> >   -pidfile /var/log/bread.pid \
> >   -cpu host \
> >   -smp 4,sockets=1,cores=4,threads=1 \
> >   -object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on \
> >   -numa node,memdev=mem0 \
> >   -drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
> >   -chardev socket,id=char0,path=/var/tmp/vhost.0 \
> >   -device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
> >   -machine usb=on \
> >   -device usb-tablet \
> >   -device usb-mouse \
> >   -device usb-kbd \
> >   -vnc :2 \
> >   -net nic,model=virtio \
> >   -net user,hostfwd=tcp::2222-:22
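For vhost-user, the `-object memory-backend-file` / `-numa node,memdev=` pair above is the critical part: the SPDK target can only translate (vtophys) guest buffers that live in shared hugepage memory. A hedged sketch of that memory configuration, including the preallocation and NUMA-binding options suggested earlier in this thread (`prealloc=yes,host-nodes=0,policy=bind` assumes a single-node host), might look like:

```shell
# Sketch only: back all guest RAM with shared, preallocated hugepages so
# the vhost target can map and translate it. host-nodes=0 assumes the
# hugepages and the SPDK reactors live on NUMA node 0.
qemu-system-x86_64 -enable-kvm -m 1G \
  -object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind \
  -numa node,memdev=mem0 \
  -chardev socket,id=char0,path=/var/tmp/vhost.0 \
  -device vhost-user-scsi-pci,id=scsi0,chardev=char0
```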
> >
> > Then, when I use fio to test the vhost NVMe disk in the guest VM, I get
> > the following error message on the host console.
> > ===========================================================================
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fed64d000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:32
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fed64d000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fed64d000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fed64d000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fed64d000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fed64d000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> > vtophys(0x7f8fed64d000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1
> > cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD
> > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > ==================================================================
> > =========
> >
> > I used the lsblk to check block device information in guest, and could
> see the
> > nvme disk with sdb.
> > >lsblk --output "NAME,KNAME,MODEL,HCTL,SIZE,VENDOR,SUBSYSTEMS"
> > ==================================================================
> > =========
> >
> > NAME KNAME MODEL HCTL SIZE VENDOR SUBSYSTEMS
> > fd0 fd0 4K block:platform
> > loop0 loop0 12.2M block
> > loop1 loop1 86.6M block
> > loop2 loop2 1.6M block
> > loop3 loop3 3.3M block
> > loop4 loop4 21M block
> > loop5 loop5 2.3M block
> > loop6 loop6 13M block
> > loop7 loop7 3.7M block
> > loop8 loop8 2.3M block
> > loop9 loop9 86.9M block
> > loop10 loop10 34.7M block
> > loop11 loop11 87M block
> > loop12 loop12 140.9M block
> > loop13 loop13 13M block
> > loop14 loop14 140M block
> > loop15 loop15 139.5M block
> > loop16 loop16 3.7M block
> > loop17 loop17 14.5M block
> > sda sda QEMU HARDDISK 0:0:0:0 32G ATA block:scsi:pci
> > sda1 sda1 32G block:scsi:pci
> > sdb sdb NVMe disk 2:0:0:0 27.3G INTEL
> block:scsi:virtio:pci
> > sr0 sr0 QEMU DVD-ROM 1:0:0:0 1024M QEMU block:scsi:pci
> > ==================================================================
> > =========
> >
> >
> > Does anyone can give me help how to solve this problem ?
> >
> > Thanks.
> > Adam Chang
> _______________________________________________
> SPDK mailing list
> SPDK(a)lists.01.org
> https://lists.01.org/mailman/listinfo/spdk
>
^ permalink raw reply [flat|nested] 9+ messages in thread
* Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
@ 2018-08-09 8:07 Stojaczyk, DariuszX
0 siblings, 0 replies; 9+ messages in thread
From: Stojaczyk, DariuszX @ 2018-08-09 8:07 UTC (permalink / raw)
To: spdk
Can you provide a full vhost log?
D.
> -----Original Message-----
> From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> Sent: Thursday, August 9, 2018 4:05 AM
> To: spdk(a)lists.01.org
> Subject: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> [quoted message trimmed -- identical to Adam Chang's original post, reproduced in full later in this thread]
* [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
@ 2018-08-09 2:04 Adam Chang
0 siblings, 0 replies; 9+ messages in thread
From: Adam Chang @ 2018-08-09 2:04 UTC (permalink / raw)
To: spdk
Hi all:
I just created an NVMe bdev and a vhost-scsi controller that can be
accessed by QEMU, but an error occurs when I/O is issued from the VM.
Here are my steps for the SPDK configuration:
Host OS:Ubuntu 18.04, Kernel 4.15.0-30
Guest OS: Ubuntu 18.04
QEMU: 2.12.0
SPDK: v18.07
1) sudo HUGEMEM=4096 scripts/setup.sh
0000:05:00.0 (8086 2522): nvme -> vfio-pci
Current user memlock limit: 4116 MB
This is the maximum amount of memory you will be
able to use with DPDK and VFIO if run as current user.
To change this, please adjust limits.conf memlock limit for current user.
2) sudo ./app/vhost/vhost -S /var/tmp -m 0x3 &
[ DPDK EAL parameters: vhost -c 0x3 -m 1024 --legacy-mem --file-prefix=spdk_pid1921 ]
EAL: Detected 12 lcore(s)
EAL: Detected 1 NUMA nodes
EAL: Multi-process socket /var/run/dpdk/spdk_pid1921/mp_socket
EAL: No free hugepages reported in hugepages-1048576kB
EAL: Probing VFIO support...
EAL: VFIO support initialized
app.c: 530:spdk_app_start: *NOTICE*: Total cores available: 2
reactor.c: 718:spdk_reactors_init: *NOTICE*: Occupied cpu socket mask is 0x1
reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 1 on socket 0
reactor.c: 492:_spdk_reactor_run: *NOTICE*: Reactor started on core 0 on socket 0
3) sudo ./scripts/rpc.py construct_vhost_scsi_controller --cpumask 0x1 vhost.0
EAL: PCI device 0000:05:00.0 on NUMA socket 0
EAL: probe driver: 8086:2522 spdk_nvme
EAL: using IOMMU type 1 (Type 1)
Nvme0n1
4) sudo ./scripts/rpc.py add_vhost_scsi_lun vhost.0 0 Nvme0n1
5) start qemu:
taskset qemu-system-x86_64 -enable-kvm -m 1G \
    -name bread,debug-threads=on \
    -daemonize \
    -pidfile /var/log/bread.pid \
    -cpu host \
    -smp 4,sockets=1,cores=4,threads=1 \
    -object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on \
    -numa node,memdev=mem0 \
    -drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
    -chardev socket,id=char0,path=/var/tmp/vhost.0 \
    -device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
    -machine usb=on \
    -device usb-tablet \
    -device usb-mouse \
    -device usb-kbd \
    -vnc :2 \
    -net nic,model=virtio \
    -net user,hostfwd=tcp::2222-:22
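For comparison, this is how the memory-backend portion of the command line looks with the extra options suggested elsewhere in this thread (`prealloc=yes,host-nodes=0,policy=bind` come from that suggestion and are untested here; treat this as a sketch, not a verified fix):

```shell
# Hypothetical variant of the memory-backend/NUMA options only;
# every other option on the command line stays the same.
-object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind \
-numa node,memdev=mem0 \
```

Preallocating and binding the backing file should ensure that the guest RAM really lives in hugepages on NUMA node 0, which is what the SPDK vhost target needs in order to map and translate the guest buffers.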
Then, when I use fio to test the vhost NVMe disk in the guest VM, I get the
following error messages on the host console.
===========================================================================
nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fed64d000) failed
nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:32
nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
[... the same four-line error group repeats for every failed read; later repetitions show len:8 instead of len:32 ...]
===========================================================================
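The vtophys() failures above mean the SPDK vhost process cannot translate the guest buffer addresses into physical addresses, which typically happens when the guest RAM is not backed by hugepages that vhost can map. Before starting the VM, it is worth confirming the host has enough free hugepages for the entire guest memory. A minimal sketch of that check (the `sample` text stands in for `/proc/meminfo`, so the numbers are illustrative):

```shell
# Sanity-check free 2 MB hugepages against what a 1 GB guest needs.
# On a real host, replace 'sample' with: grep HugePages /proc/meminfo
sample='HugePages_Total:    2048
HugePages_Free:      512'

free_pages=$(printf '%s\n' "$sample" | awk '/HugePages_Free/ {print $2}')
needed=512   # 1 GB of guest RAM / 2 MB per hugepage

if [ "$free_pages" -ge "$needed" ]; then
    echo "enough free hugepages ($free_pages >= $needed)"
else
    echo "not enough free hugepages ($free_pages < $needed)"
fi
```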
I used lsblk to check the block device information in the guest and could
see the NVMe disk as sdb.
>lsblk --output "NAME,KNAME,MODEL,HCTL,SIZE,VENDOR,SUBSYSTEMS"
===========================================================================
NAME    KNAME   MODEL          HCTL     SIZE    VENDOR  SUBSYSTEMS
fd0     fd0                             4K              block:platform
loop0   loop0                           12.2M           block
loop1   loop1                           86.6M           block
loop2   loop2                           1.6M            block
loop3   loop3                           3.3M            block
loop4   loop4                           21M             block
loop5   loop5                           2.3M            block
loop6   loop6                           13M             block
loop7   loop7                           3.7M            block
loop8   loop8                           2.3M            block
loop9   loop9                           86.9M           block
loop10  loop10                          34.7M           block
loop11  loop11                          87M             block
loop12  loop12                          140.9M          block
loop13  loop13                          13M             block
loop14  loop14                          140M            block
loop15  loop15                          139.5M          block
loop16  loop16                          3.7M            block
loop17  loop17                          14.5M           block
sda     sda     QEMU HARDDISK  0:0:0:0  32G     ATA     block:scsi:pci
sda1    sda1                            32G             block:scsi:pci
sdb     sdb     NVMe disk      2:0:0:0  27.3G   INTEL   block:scsi:virtio:pci
sr0     sr0     QEMU DVD-ROM   1:0:0:0  1024M   QEMU    block:scsi:pci
===========================================================================
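The post mentions fio but not the exact job; for anyone trying to reproduce, a typical read test against the vhost-scsi disk (seen as /dev/sdb in this guest) might look like the following. All parameters here are illustrative assumptions, since the original command is not given:

```shell
# Hypothetical fio invocation, run inside the guest.
fio --name=vhost-read --filename=/dev/sdb --rw=read --bs=16k \
    --ioengine=libaio --direct=1 --iodepth=32 \
    --runtime=30 --time_based
```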
Can anyone help me solve this problem?
Thanks.
Adam Chang
end of thread, other threads:[~2018-08-10 8:54 UTC | newest]
Thread overview: 9+ messages
2018-08-09 6:20 [SPDK] Error when issue IO in QEMU to vhost scsi NVMe Wodkowski, PawelX
-- strict thread matches above, loose matches on Subject: below --
2018-08-10 8:54 Adam Chang
2018-08-10 5:14 Stojaczyk, DariuszX
2018-08-10 5:01 Adam Chang
2018-08-09 13:55 Stojaczyk, DariuszX
2018-08-09 12:42 Wodkowski, PawelX
2018-08-09 10:56 Adam Chang
2018-08-09 8:07 Stojaczyk, DariuszX
2018-08-09 2:04 Adam Chang