From: Adam Chang <adamchang at qnap.com>
To: spdk@lists.01.org
Subject: Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
Date: Fri, 10 Aug 2018 16:54:35 +0800	[thread overview]
Message-ID: <CANvoUxg1hAsQt6dGVuCjBn_4DFwqvqbGpAzHiqGaXgorqXYjRw@mail.gmail.com> (raw)
In-Reply-To: FBE7E039FA50BF47A673AD0BD3CD56A8461BE468@HASMSX105.ger.corp.intel.com


Hi all:

After applying the patch and retesting, IO from the VM can now be issued to the vhost-scsi NVMe device.
Thank you for the help!

Adam Chang.

On Fri, Aug 10, 2018 at 1:14 PM Stojaczyk, DariuszX <dariuszx.stojaczyk(a)intel.com> wrote:

> > [ 4640.033876] DMAR: intel_iommu_map: iommu width (39) is not sufficient for the mapped address (7fdce8200000)
>
> Thanks. It is SPDK's fault. I already pushed a patch to fix it:
> https://review.gerrithub.io/c/spdk/spdk/+/421697
> D.
>
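For context on the kernel message above: the host virtual address in the error needs more bits than the IOMMU exposes. A quick illustrative check (Python, not part of the original thread):

```python
# The DMAR error says a 39-bit IOMMU cannot map this host virtual address.
# Its bit length shows why: the address needs 47 bits.
addr = 0x7fdce8200000   # host virtual address from the dmesg error
iommu_width = 39        # width reported by intel_iommu_map

bits_needed = addr.bit_length()
print(bits_needed)                 # 47
print(bits_needed > iommu_width)   # True: the mapping cannot fit
```

This is why the fix on the SPDK side is to place the vhost memory mappings at virtual addresses the IOMMU can actually cover.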
> > -----Original Message-----
> > From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> > Sent: Friday, August 10, 2018 7:02 AM
> > To: Storage Performance Development Kit <spdk(a)lists.01.org>
> > Subject: Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> >
> > Hi:
> > Here is my host environment:
> > ==================================================================
> > Host OS: Ubuntu 18.04 x86_64
> > Linux Kernel: 4.15.0-30
> > CPU: Intel i7 8700K
> > Memory: 32GB
> > NVME SSD: Intel Optane Memory 32GB
> > ==================================================================
> > configuration for building QEMU:
> > ==================================================================
> > ./configure --prefix=/usr --target-list=x86_64-softmmu --enable-kvm \
> >     --enable-debug --enable-debug-info --enable-modules --enable-linux-aio \
> >     --enable-vnc --enable-trace-backends=log --enable-numa \
> >     --disable-werror --disable-strip --with-sdlabi=2.0
> > ==================================================================
> >
> > configuration for building SPDK:
> > ==================================================================
> > ./configure --enable-debug
> >
> > ==================================================================
> >
> > I checked dmesg, and it showed the following error:
> >
> > ==================================================================
> >
> > [ 4640.033876] DMAR: intel_iommu_map: iommu width (39) is not sufficient for the mapped address (7fdce8200000)
> >
> > ==================================================================
> >
> >
> > I added the log option when starting the vhost target, as follows:
> > ==================================================================
> > ./app/vhost/vhost -S /var/tmp -m 0x3 -L vhost vhost_scsi &
> > ==================================================================
> >
> >
> > And here is my vhost log:
> > ==================================================================
> > VHOST_CONFIG: new vhost user connection is 18
> > VHOST_CONFIG: new device, handle is 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_PROTOCOL_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_PROTOCOL_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_QUEUE_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_OWNER
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:0 file:25
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:1 file:26
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> > VHOST_CONFIG: vring call idx:2 file:27
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> > guest physical addr: 0x0
> > guest virtual  addr: 0x7f4754600000
> > host  virtual  addr: 0x7fdce8000000
> > mmap addr : 0x7fdce8000000
> > mmap size : 0x40000000
> > mmap align: 0x200000
> > mmap off  : 0x0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:2 file:29
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > Cannot set up DMA mapping, error 14
> > vhost.c: 541:spdk_vhost_dev_mem_register: *WARNING*: Failed to register memory region 0. Future vtophys translation might fail.
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: vring call idx:0 file:30
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: vring call idx:1 file:25
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: vring call idx:2 file:26
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
> > [... the VHOST_USER_SET_MEM_TABLE message above repeats 106 more times ...]
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_VRING_BASE
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: vring base idx:2 file:259
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
> > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> > guest physical addr: 0x0
> > guest virtual  addr: 0x7f4754600000
> > host  virtual  addr: 0x7fdce8000000
> > mmap addr : 0x7fdce8000000
> > mmap size : 0x40000000
> > mmap align: 0x200000
> > mmap off  : 0x0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > VHOST_CONFIG: vring kick idx:0 file:27
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: vring kick idx:1 file:28
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: vring kick idx:2 file:29
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: vring call idx:0 file:31
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: vring call idx:1 file:30
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> > vhost_scsi.c:1141:destroy_device_poller_cb: *INFO*: Stopping poller for vhost controller vhost.0
> > VHOST_CONFIG: vring call idx:2 file:25
> > VHOST_CONFIG: virtio is now ready for processing.
> > vhost.c: 537:spdk_vhost_dev_mem_register: *INFO*: Registering VM memory for vtophys translation - 0x7fdce8000000 len:0x40000000
> > vhost_scsi.c:1099:spdk_vhost_scsi_start: *INFO*: Started poller for vhost controller vhost.0 on lcore 0
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7fdd1c467000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1511:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7fdd1c467000) failed
> > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
> > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> > bdev_nvme.c:1511:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> > ==================================================================
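The guest memory region dump in the log above contains everything needed for the GPA-to-HVA translation that the vtophys registration relies on: guest physical address 0x0 maps to host virtual address 0x7fdce8000000 over a 1 GiB region. A minimal illustrative sketch of that translation (the function name is hypothetical, not SPDK's actual API):

```python
# Sketch of the guest-physical -> host-virtual translation the vhost target
# performs for the memory region reported in the log above.
# Region values come from the log; gpa_to_hva() is illustrative only.

REGION = {
    "guest_phys_addr": 0x0,            # guest physical addr from the log
    "host_virt_addr": 0x7fdce8000000,  # host  virtual  addr from the log
    "size": 0x40000000,                # 1 GiB region
}

def gpa_to_hva(gpa: int, region: dict = REGION) -> int:
    """Translate a guest physical address into a host virtual address."""
    start = region["guest_phys_addr"]
    if not (start <= gpa < start + region["size"]):
        raise ValueError("guest address outside the registered region")
    return region["host_virt_addr"] + (gpa - start)

# A 2 MiB offset into guest RAM lands on the host address from the dmesg error.
print(hex(gpa_to_hva(0x200000)))  # 0x7fdce8200000
```

This also shows why the dmesg complaint and the vhost log are about the same mapping: the failing address 0x7fdce8200000 sits inside this registered region.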
> >
> > Thanks,
> > Adam Chang.
> >
> > On Thu, Aug 9, 2018 at 9:55 PM Stojaczyk, DariuszX <dariuszx.stojaczyk(a)intel.com> wrote:
> >
> >
> >       Thanks,
> >       The address that vtophys fails on should be mapped. Something went wrong, but that vhost log is not particularly helpful because it comes from a non-debug app.
> >       I could ask you to enable debug (./configure --enable-debug), but could you first provide the dmesg? Are there any errors?
> >       D.
> >
> >       > -----Original Message-----
> >       > From: SPDK [mailto:spdk-bounces(a)lists.01.org] On Behalf Of Adam Chang
> >       > Sent: Thursday, August 9, 2018 12:56 PM
> >       > To: Storage Performance Development Kit <spdk(a)lists.01.org>
> >       > Subject: Re: [SPDK] Error when issue IO in QEMU to vhost scsi NVMe
> >       >
> >       > Hi:
> >       > I have added "-numa node,memdev=mem0" to the QEMU command line, but I still get the same error message.
> >       > Here are my modified QEMU command arguments:
> >       >
> > ==================================================================
> >       > taskset -c 2,3,4,5 qemu-system-x86_64 -enable-kvm -m 1G \
> >       >         -name bread,debug-threads=on \
> >       >         -daemonize \
> >       >         -pidfile /var/log/bread.pid \
> >       >         -cpu host \
> >       >         -smp 4,sockets=1,cores=4,threads=1 \
> >       >         -object memory-backend-file,id=mem0,size=1G,mem-path=/dev/hugepages,share=on,prealloc=yes,host-nodes=0,policy=bind \
> >       >         -numa node,memdev=mem0 \
> >       >         -drive file=../ubuntu.img,media=disk,cache=unsafe,aio=threads,format=qcow2 \
> >       >         -chardev socket,id=char0,path=/var/tmp/vhost.0 \
> >       >         -device vhost-user-scsi-pci,id=scsi0,chardev=char0 \
> >       >         -machine usb=on \
> >       >         -device usb-tablet \
> >       >         -device usb-mouse \
> >       >         -device usb-kbd \
> >       >         -vnc :2 \
> >       >         -net nic,model=virtio \
> >       >         -net user,hostfwd=tcp::2222-:22
> > ==================================================================
> >       >
> >       > And the following is the vhost log from QEMU startup:
> >       >
> > ==================================================================
> >       > VHOST_CONFIG: new vhost user connection is 18
> >       > VHOST_CONFIG: new device, handle is 0
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_PROTOCOL_FEATURES
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_PROTOCOL_FEATURES
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_QUEUE_NUM
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_OWNER
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_GET_FEATURES
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> >       > VHOST_CONFIG: vring call idx:0 file:25
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> >       > VHOST_CONFIG: vring call idx:1 file:26
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> >       > VHOST_CONFIG: vring call idx:2 file:27
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_FEATURES
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_NUM
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_BASE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_ADDR
> >       > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> >       > guest physical addr: 0x0
> >       > guest virtual  addr: 0x7fa1a4a00000
> >       > host  virtual  addr: 0x7f8fb4000000
> >       > mmap addr : 0x7f8fb4000000
> >       > mmap size : 0x40000000
> >       > mmap align: 0x200000
> >       > mmap off  : 0x0
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_KICK
> >       > VHOST_CONFIG: vring kick idx:2 file:29
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> >       > VHOST_CONFIG: vring call idx:0 file:30
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> >       > VHOST_CONFIG: vring call idx:1 file:25
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_VRING_CALL
> >       > VHOST_CONFIG: vring call idx:2 file:26
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message VHOST_USER_SET_MEM_TABLE
> >       > [... the VHOST_USER_SET_MEM_TABLE message above repeats 71 more times ...]
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_GET_VRING_BASE
> >       > VHOST_CONFIG: vring base idx:2 file:259
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_FEATURES
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_MEM_TABLE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_NUM
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_BASE
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_ADDR
> >       > VHOST_CONFIG: guest memory region 0, size: 0x40000000
> >       > guest physical addr: 0x0
> >       > guest virtual  addr: 0x7fa1a4a00000
> >       > host  virtual  addr: 0x7f8fb4000000
> >       > mmap addr : 0x7f8fb4000000
> >       > mmap size : 0x40000000
> >       > mmap align: 0x200000
> >       > mmap off  : 0x0
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_KICK
> >       > VHOST_CONFIG: vring kick idx:0 file:27
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_NUM
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_BASE
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_ADDR
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_KICK
> >       > VHOST_CONFIG: vring kick idx:1 file:28
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_NUM
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_BASE
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_ADDR
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_KICK
> >       > VHOST_CONFIG: vring kick idx:2 file:29
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_CALL
> >       > VHOST_CONFIG: vring call idx:0 file:31
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_CALL
> >       > VHOST_CONFIG: vring call idx:1 file:30
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > VHOST_CONFIG: /var/tmp/vhost.0: read message
> >       > VHOST_USER_SET_VRING_CALL
> >       > VHOST_CONFIG: vring call idx:2 file:25
> >       > VHOST_CONFIG: virtio is now ready for processing.
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*: vtophys(0x7f8fe7fc9000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:0 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID FIELD (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc = -22
> >       > [... the above four error lines repeated many times ...]
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc8000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> >       > cid:95 nsid:1 lba:24 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> >       > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc8000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> >       > cid:95 nsid:1 lba:24 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> >       > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc8000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> >       > cid:95 nsid:1 lba:24 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> >       > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc8000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> >       > cid:95 nsid:1 lba:24 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> >       > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc8000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> >       > cid:95 nsid:1 lba:24 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> >       > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc8000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> >       > cid:95 nsid:1 lba:24 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> >       > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc9000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> >       > cid:95 nsid:1 lba:0 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> >       > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc9000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> >       > cid:95 nsid:1 lba:0 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> >       > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc9000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> >       > cid:95 nsid:1 lba:0 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> >       > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc9000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > sqid:1
> >       > cid:95 nsid:1 lba:0 len:8
> >       > nvme_qpair.c: 306:nvme_qpair_print_completion: *NOTICE*: INVALID
> > FIELD
> >       > (00/02) sqid:1 cid:95 cdw0:0 sqhd:0000 p:0 m:0 dnr:1
> >       > bdev_nvme.c:1521:bdev_nvme_queue_cmd: *ERROR*: readv failed: rc
> > = -22
> >       > nvme_pcie.c:1706:nvme_pcie_prp_list_append: *ERROR*:
> >       > vtophys(0x7f8fe7fc9000) failed
> >       > nvme_qpair.c: 137:nvme_io_qpair_print_command: *NOTICE*: READ
> > s
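For readers skimming the log: the repeated sequence is a single failure chain. vtophys cannot translate the guest buffer's virtual address into a physical address (the IOMMU mapping failed on the host, per the DMAR message quoted earlier in the thread), so PRP list construction aborts and the read completes with -EINVAL (-22). Below is a minimal sketch of that propagation; the region bounds, function names, and return conventions are illustrative assumptions, not SPDK's actual implementation:

```python
import errno

# Hypothetical registered DMA region (one 2 MiB hugepage). Only addresses
# inside a registered region can be translated for the device.
REG_START = 0x7F8FE0000000
REG_SIZE = 2 * 1024 * 1024


def vtophys(vaddr: int):
    """Translate virtual -> physical; None if the page is not registered."""
    if REG_START <= vaddr < REG_START + REG_SIZE:
        return vaddr - REG_START + 0x1_0000_0000  # fake physical base
    return None  # corresponds to "vtophys(...) failed" in the log


def prp_list_append(vaddr: int) -> int:
    """Abort PRP construction when any page fails translation."""
    if vtophys(vaddr) is None:
        return -errno.EINVAL  # surfaces as "readv failed: rc = -22"
    return 0


# Address taken from the log, outside the (assumed) registered region.
rc = prp_list_append(0x7F8FE7FC9000)
print(f"readv rc = {rc}")  # → readv rc = -22
```

The dnr:1 (do not retry) bit in the completion matches this: the address is permanently untranslatable until the mapping is fixed, which is why the patched IOMMU handling resolves it.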


             reply	other threads:[~2018-08-10  8:54 UTC|newest]

Thread overview: 9+ messages
2018-08-10  8:54 Adam Chang [this message]
  -- strict thread matches above, loose matches on Subject: below --
2018-08-10  5:14 [SPDK] Error when issue IO in QEMU to vhost scsi NVMe Stojaczyk, DariuszX
2018-08-10  5:01 Adam Chang
2018-08-09 13:55 Stojaczyk, DariuszX
2018-08-09 12:42 Wodkowski, PawelX
2018-08-09 10:56 Adam Chang
2018-08-09  8:07 Stojaczyk, DariuszX
2018-08-09  6:20 Wodkowski, PawelX
2018-08-09  2:04 Adam Chang
