Failed
Console Output

Started by upstream project "doctor-verify-master" build number 142
originally caused by:
 Triggered by Gerrit: https://gerrit.opnfv.org/gerrit/66019
[EnvInject] - Loading node environment variables.
Building remotely on nokia-pod1 (nokia opnfv-sysinfo doctor-apex-x86_64) in workspace /home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent]   Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-4CbjiYzCgjXZ/agent.38212
SSH_AGENT_PID=38214
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: /home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master@tmp/private_key_7681278021475266567.key (/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master@tmp/private_key_7681278021475266567.key)
[ssh-agent] Using credentials jenkins-ci (Jenkins Master SSH)
using credential d42411ac011ad6f3dd2e1fa34eaa5d87f910eb2e
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://gerrit.opnfv.org/gerrit/doctor
 > git init /home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master # timeout=10
Fetching upstream changes from https://gerrit.opnfv.org/gerrit/doctor
 > git --version # timeout=10
using GIT_SSH to set credentials Jenkins Master SSH
 > git fetch --tags --progress https://gerrit.opnfv.org/gerrit/doctor +refs/heads/*:refs/remotes/origin/* # timeout=15
 > git config remote.origin.url https://gerrit.opnfv.org/gerrit/doctor # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://gerrit.opnfv.org/gerrit/doctor # timeout=10
Fetching upstream changes from https://gerrit.opnfv.org/gerrit/doctor
using GIT_SSH to set credentials Jenkins Master SSH
 > git fetch --tags --progress https://gerrit.opnfv.org/gerrit/doctor refs/changes/19/66019/7 # timeout=15
Checking out Revision 73605c5c34b97ab56306bfa9af0f5888f3c7e46d (refs/changes/19/66019/7)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 73605c5c34b97ab56306bfa9af0f5888f3c7e46d # timeout=15
Commit message: "Support Fenix as admin tool"
 > git rev-parse FETCH_HEAD^{commit} # timeout=10
 > git rev-list --no-walk 33293e9c23a21ad3228f46d2063f18c915eb2b79 # timeout=10
No emails were triggered.
[doctor-verify-maintenance-apex-sample-x86_64-master] $ /usr/bin/env bash /tmp/jenkins1840108059538180098.sh
Gathering IP information for Apex installer VM
 6     undercloud                     running
Installer VM detected
Installer ip is 192.168.122.40
fetch_os_creds.info: Fetching rc file...
fetch_os_creds.info: Verifying connectivity to 192.168.122.40...
fetch_os_creds.info: 192.168.122.40 is reachable!
fetch_os_creds.info: ... from Instack VM 192.168.122.40...
Warning: Permanently added '192.168.122.40' (ECDSA) to the list of known hosts.
-------- Credentials: --------
# Clear any old environment that may conflict.
for key in $( set | awk '{FS="="}  /^OS_/ {print $1}' ); do unset $key ; done
export OS_NO_CACHE=True
export COMPUTE_API_VERSION=1.1
export OS_USERNAME=admin
export no_proxy=,192.168.37.19,192.0.2.4
export OS_USER_DOMAIN_NAME=Default
export OS_VOLUME_API_VERSION=3
export OS_CLOUDNAME=overcloud
export OS_AUTH_URL=http://192.168.37.19:5000/v3
export NOVA_VERSION=1.1
export OS_IMAGE_API_VERSION=2
#export OS_PASSWORD=N9flPG9TD7lwww7vsVewDHyBK
export OS_PASSWORD=admin
export OS_PROJECT_DOMAIN_NAME=Default
export OS_IDENTITY_API_VERSION=3
export OS_PROJECT_NAME=admin
export OS_AUTH_TYPE=password
export PYTHONWARNINGS="ignore:Certificate has no, ignore:A true SSLContext object is not available"

# Add OS_CLOUDNAME to PS1
if [ -z "${CLOUDPROMPT_ENABLED:-}" ]; then
    export PS1=${PS1:-""}
    export PS1=\${OS_CLOUDNAME:+"(\$OS_CLOUDNAME)"}\ $PS1
    export CLOUDPROMPT_ENABLED=1
fi
export OS_PROJECT_ID=6e76c202356a46cbbd626ef7c00efc0c
export OS_TENANT_NAME=admin
export OS_REGION_NAME=regionOne
[doctor-verify-maintenance-apex-sample-x86_64-master] $ /bin/sh -xe /tmp/jenkins3279095533310588313.sh
+ source /home/jenkins/opnfv-openrc.sh
+++ set
+++ awk '{FS="="}  /^OS_/ {print $1}'
++ export OS_NO_CACHE=True
++ OS_NO_CACHE=True
++ export COMPUTE_API_VERSION=1.1
++ COMPUTE_API_VERSION=1.1
++ export OS_USERNAME=admin
++ OS_USERNAME=admin
++ export no_proxy=,192.168.37.19,192.0.2.4
++ no_proxy=,192.168.37.19,192.0.2.4
++ export OS_USER_DOMAIN_NAME=Default
++ OS_USER_DOMAIN_NAME=Default
++ export OS_VOLUME_API_VERSION=3
++ OS_VOLUME_API_VERSION=3
++ export OS_CLOUDNAME=overcloud
++ OS_CLOUDNAME=overcloud
++ export OS_AUTH_URL=http://192.168.37.19:5000/v3
++ OS_AUTH_URL=http://192.168.37.19:5000/v3
++ export NOVA_VERSION=1.1
++ NOVA_VERSION=1.1
++ export OS_IMAGE_API_VERSION=2
++ OS_IMAGE_API_VERSION=2
++ export OS_PASSWORD=admin
++ OS_PASSWORD=admin
++ export OS_PROJECT_DOMAIN_NAME=Default
++ OS_PROJECT_DOMAIN_NAME=Default
++ export OS_IDENTITY_API_VERSION=3
++ OS_IDENTITY_API_VERSION=3
++ export OS_PROJECT_NAME=admin
++ OS_PROJECT_NAME=admin
++ export OS_AUTH_TYPE=password
++ OS_AUTH_TYPE=password
++ export 'PYTHONWARNINGS=ignore:Certificate has no, ignore:A true SSLContext object is not available'
++ PYTHONWARNINGS='ignore:Certificate has no, ignore:A true SSLContext object is not available'
++ '[' -z '' ']'
++ export PS1=
++ PS1=
++ export 'PS1=${OS_CLOUDNAME:+($OS_CLOUDNAME)} '
++ PS1='${OS_CLOUDNAME:+($OS_CLOUDNAME)} '
++ export CLOUDPROMPT_ENABLED=1
++ CLOUDPROMPT_ENABLED=1
++ export OS_PROJECT_ID=6e76c202356a46cbbd626ef7c00efc0c
++ OS_PROJECT_ID=6e76c202356a46cbbd626ef7c00efc0c
++ export OS_TENANT_NAME=admin
++ OS_TENANT_NAME=admin
++ export OS_REGION_NAME=regionOne
++ OS_REGION_NAME=regionOne
+ '[' -f /home/jenkins/os_cacert ']'
+ source /home/jenkins/opnfv-installer.sh
++ export INSTALLER_TYPE=apex
++ INSTALLER_TYPE=apex
++ export INSTALLER_IP=192.168.122.40
++ INSTALLER_IP=192.168.122.40
++ export SSH_KEY=/home/jenkins/installer_key_file
++ SSH_KEY=/home/jenkins/installer_key_file
+ sudo -E tox -e py34
py34 create: /home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34
py34 installdeps: -r/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/requirements.txt
py34 develop-inst: /home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master
py34 installed: DEPRECATION: Python 3.4 support has been deprecated. pip 19.1 will be the last one supporting it. Please upgrade your Python as Python 3.4 won't be maintained after March 2019 (cf PEP 429).,amqp==2.2.1,aodhclient==0.9.0,appdirs==1.4.3,asn1crypto==0.22.0,Babel==2.3.4,bcrypt==3.1.3,cachetools==2.0.0,certifi==2017.4.17,cffi==1.10.0,chardet==3.0.4,click==6.7,cliff==2.8.2,cmd2==0.7.5,contextlib2==0.5.5,cryptography==2.0.2,debtcollector==1.17.1,deprecation==1.0.1,-e git+https://gerrit.opnfv.org/gerrit/doctor@73605c5c34b97ab56306bfa9af0f5888f3c7e46d#egg=doctor_tests,enum-compat==0.0.2,eventlet==0.20.0,fasteners==0.14.1,flake8==2.5.5,Flask==0.12.2,futurist==1.3.1,greenlet==0.4.15,idna==2.5,iso8601==0.1.11,itsdangerous==0.24,Jinja2==2.9.6,jsonpatch==1.16,jsonpointer==1.10,jsonschema==2.6.0,keystoneauth1==3.1.0,kombu==4.1.0,MarkupSafe==1.0,mccabe==0.4.0,monotonic==1.3,msgpack-python==0.4.8,netaddr==0.7.19,netifaces==0.10.6,openstacksdk==0.9.17,os-client-config==1.28.0,osc-lib==1.7.0,oslo.concurrency==3.21.1,oslo.config==4.11.1,oslo.context==2.17.1,oslo.i18n==3.17.1,oslo.log==3.30.2,oslo.messaging==5.30.7,oslo.middleware==3.30.1,oslo.serialization==2.20.2,oslo.service==1.25.1,oslo.utils==3.28.3,oslo.versionedobjects==1.26.2,paramiko==2.2.1,Paste==2.0.3,PasteDeploy==1.5.2,pbr==3.1.1,pep8==1.7.1,pika==0.10.0,pika-pool==0.1.3,positional==1.1.2,prettytable==0.7.2,pyasn1==0.3.1,pycparser==2.18,pyflakes==1.0.0,pyinotify==0.9.6,PyNaCl==1.1.2,pyOpenSSL==17.2.0,pyparsing==2.2.0,pyperclip==1.5.27,python-ceilometerclient==2.9.0,python-cinderclient==3.1.1,python-congressclient==1.8.0,python-dateutil==2.6.1,python-glanceclient==2.8.0,python-heatclient==1.11.1,python-keystoneclient==3.13.0,python-neutronclient==6.5.0,python-novaclient==9.1.2,python-openstackclient==3.12.1,python-swiftclient==3.4.0,python-vitrageclient==1.4.0,pytz==2017.2,PyYAML==3.12,repoze.lru==0.6,requests==2.18.2,requestsexceptions==1.3.0,rfc3986==1.1.0,Routes==2.4.1,scp==0.10.2,simplejson==3.11.1,six==1.10.0,statsd==3.2.1,stevedore==1.25.1,tenacity==4.4.0,urllib3==1.22,vine==1.1.4,virtualenv==15.1.0,warlock==1.2.0,WebOb==1.7.3,Werkzeug==0.12.2,wrapt==1.10.10
py34 runtests: PYTHONHASHSEED='66709364'
py34 runtests: commands[0] | doctor-test
/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/paramiko/client.py:711: UserWarning: Unknown ssh-ed25519 host key for 192.168.122.40: b'3dc1628c6fe2819031608f780639f40a'
  key.get_fingerprint())))
2019-03-26 16:25:26,239 main.py 130 INFO   doctor test starting.......
2019-03-26 16:25:26,239 apex.py 43 INFO   Setup Apex installer start......
2019-03-26 16:25:26,239 base.py 113 INFO   Get SSH keys from apex installer......
2019-03-26 16:25:26,667 apex.py 67 INFO   Get overcloud config details from Apex installer......
2019-03-26 16:25:26,667 base.py 174 INFO   Run command=source stackrc; nova list | grep ' overcloud-' in apex installer......
2019-03-26 16:25:30,373 base.py 183 INFO   Output=['| 2d0aedd7-4e30-4fee-bbd2-0feb8b1078c4 | overcloud-controller-0  | ACTIVE | -          | Running     | ctlplane=192.0.2.9 |', '| 37ceec17-a130-4826-87f0-67699d3350f5 | overcloud-novacompute-0 | ACTIVE | -          | Running     | ctlplane=192.0.2.6 |', '| 416383de-15fb-477a-9a6c-f0b21df9be70 | overcloud-novacompute-1 | ACTIVE | -          | Running     | ctlplane=192.0.2.7 |', '| 0f025a92-3365-4c3b-951e-dab9a6d9ca22 | overcloud-novacompute-2 | ACTIVE | -          | Running     | ctlplane=192.0.2.3 |'] command=source stackrc; nova list | grep ' overcloud-' in apex installer
2019-03-26 16:25:30,373 base.py 188 INFO   Check command=grep docker /home/stack/deploy_command return in apex installer......
2019-03-26 16:25:30,436 base.py 191 INFO   return 0
2019-03-26 16:25:30,436 apex.py 80 INFO   controller_ips:['192.0.2.9']
2019-03-26 16:25:30,437 apex.py 81 INFO   compute_ips:['192.0.2.6', '192.0.2.7', '192.0.2.3']
2019-03-26 16:25:30,437 apex.py 82 INFO   use_containers:True
2019-03-26 16:25:30,438 base.py 174 INFO   Run command=scp overcloudrc heat-admin@192.0.2.9: in apex installer......
2019-03-26 16:25:30,862 base.py 183 INFO   Output=[] command=scp overcloudrc heat-admin@192.0.2.9: in apex installer
2019-03-26 16:25:31,779 apex.py 102 INFO   Set apply patches start......
/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/paramiko/client.py:711: UserWarning: Unknown ssh-ed25519 host key for 192.0.2.9: b'b3b023c9e950672f64d119fecdae83fd'
  key.get_fingerprint())))
2019-03-26 16:25:32,373 base.py 218 INFO   Command sudo python set_config.py output ['Add event notifier in ceilometer', 'NOTE: add compute.instance.update to event_definitions.yaml', 'NOTE: add maintenance.scheduled to event_definitions.yaml', 'NOTE: add maintenance.host to event_definitions.yaml']
2019-03-26 16:25:33,894 base.py 218 INFO   Command sudo python restart_aodh.py output []
2019-03-26 16:25:37,183 apex.py 148 INFO   Set apply patches start......
/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/paramiko/client.py:711: UserWarning: Unknown ssh-ed25519 host key for 192.0.2.6: b'5ffdaf1f169876d8c9ac845458601952'
  key.get_fingerprint())))
/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/paramiko/client.py:711: UserWarning: Unknown ssh-ed25519 host key for 192.0.2.7: b'50858b8f4760d0c962bbdcb845047215'
  key.get_fingerprint())))
/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/paramiko/client.py:711: UserWarning: Unknown ssh-ed25519 host key for 192.0.2.3: b'890a9b151267bafe34f53791df2c09f2'
  key.get_fingerprint())))
2019-03-26 16:25:37,539 base.py 218 INFO   Command sudo python set_compute_config.py output []
2019-03-26 16:25:37,655 base.py 218 INFO   Command sudo python set_compute_config.py output []
2019-03-26 16:25:37,689 base.py 218 INFO   Command sudo python set_compute_config.py output []
2019-03-26 16:25:42,411 base.py 63 INFO   Setup ssh stunnel in apex installer......
2019-03-26 16:25:42,411 base.py 76 INFO   tunnel for port 12346
2019-03-26 16:25:42,416 base.py 76 INFO   tunnel for port 12348
2019-03-26 16:25:42,421 base.py 76 INFO   tunnel for port 12345
2019-03-26 16:25:42,425 base.py 94 INFO   tunnel for port 12347
2019-03-26 16:25:42,429 image.py 48 INFO   image create start......
2019-03-26 16:26:01,627 image.py 68 INFO   image create end......
2019-03-26 16:26:01,627 user.py 70 INFO   user create start......
2019-03-26 16:26:01,856 user.py 86 INFO   create project......
2019-03-26 16:26:02,825 user.py 95 INFO   test project <Project description=, domain_id=default, enabled=True, id=e2b561e912b9452b8f992eff81ad9a76, is_domain=False, links={'self': 'http://192.0.2.4:35357/v3/projects/e2b561e912b9452b8f992eff81ad9a76'}, name=doctor, parent_id=default, tags=[]>
2019-03-26 16:26:03,091 user.py 103 INFO   create user......
2019-03-26 16:26:08,019 user.py 113 INFO   test user <User domain_id=default, enabled=True, id=cecd5c460ffa4542b4ce7a144acd4b26, links={'self': 'http://192.0.2.4:35357/v3/users/cecd5c460ffa4542b4ce7a144acd4b26'}, name=doctor, options={}, password_expires_at=None>
2019-03-26 16:26:08,287 user.py 127 INFO   role _member_ already created......
2019-03-26 16:26:08,287 user.py 128 INFO   test role <Role domain_id=None, id=9fe2ff9ee4384b1894a90878d3e92bab, links={'self': 'http://192.0.2.4:35357/v3/roles/9fe2ff9ee4384b1894a90878d3e92bab'}, name=_member_>
2019-03-26 16:26:09,721 user.py 78 INFO   user create end......
2019-03-26 16:26:10,606 main.py 104 INFO   doctor maintenance test starting.......
2019-03-26 16:26:13,245 maintenance.py 62 INFO   checking hypervisors.......
2019-03-26 16:26:13,245 maintenance.py 95 INFO   testing 3 computes with 32 vcpus each
2019-03-26 16:26:13,246 maintenance.py 98 INFO   testing 2 actstdby and 4 noredundancy instances
2019-03-26 16:26:13,246 user.py 190 INFO   quota update start......
2019-03-26 16:26:13,246 user.py 206 INFO   default quota update start......
2019-03-26 16:26:14,263 user.py 217 INFO   user quota update start......
2019-03-26 16:26:14,646 user.py 230 INFO   quota update end......
2019-03-26 16:26:15,687 maintenance.py 117 INFO   creating maintenance stack.......
2019-03-26 16:26:15,687 maintenance.py 118 INFO   parameters: {'ext_net': 'external', 'flavor_vcpus': 16, 'ha_intances': 2, 'maint_image': 'cirros', 'nonha_intances': 4}
2019-03-26 16:30:22,644 base.py 218 INFO   Command sudo chmod 700 set_fenix.sh;sudo ./set_fenix.sh output ['e9e1115216ea', 'e9e1115216ea', 'Untagged: fenix:latest', 'Deleted: sha256:397c24ac77e8c59562d132cd0c290820e13c2d01e00e39d419a0445b769d0b1c', 'Deleted: sha256:e49df1c386bd2128a641c4b925a07e270b9fe54605221077e7f66b662782feb8', 'Deleted: sha256:68c8aedd9421b2be439b67b7c5c546ef430d42e43141ad8811e17aef2213778c', 'Deleted: sha256:49b6ab38e2e7d8fd8996e9624e6afd7a7aed90fa6da400fdd8c5a35dad26afec', 'Deleted: sha256:81fd2b623c5ca285e72b24ec23c0c5e348afd4f7be45d6d50261e3d5b1ef9233', 'Deleted: sha256:ea4797aa34736fdcf32f0ad15809ced6dea0db12bc9b0eb8c1dc7e99c554ac52', 'Deleted: sha256:cf92bca8db1e03906d40476cb38d5fb97883992458688030bdffbb15baa6fbe2', 'Deleted: sha256:889a91776a77425edb11b143df3d4dd2fbda9c0f70a528cbdf017b5500085e3c', 'Deleted: sha256:7ee8c78b6cb355000357e1a2e6545de187f7e052ef2ecdc8ec37d7aa5956068f', 'Deleted: sha256:3f6fc6a9048aaee2e6d51c56d70dfd934eb0f895a77502f23222f8f76b80ad16', 'Deleted: sha256:0132dbe8e3c97b750b6a56c9ad2c007ee73a71c2dbe23900bfe4b3b5b1a06af9', 'Deleted: sha256:b9bdf000db582de345e4e739fe77c255a1acae1e3b054d0f7337f70dce1908f4', 'Deleted: sha256:1d132ffbfe57bc950fab0985d06f440661130497972b60320b9b72c3ff15515c', 'Deleted: sha256:df93c8b11898c89bd03a5f99b0a7540902c83216892d67eac67637efc9188210', 'Deleted: sha256:0ea02d3f3a66097ffd86aad378704b3cdc0f03d4c6d09c2c7869c16df1fdd53d', 'Deleted: sha256:0f53d36f0a84352ee4fd637c12f854436e0879aef734f4d2aa63fd61f6fec814', 'Deleted: sha256:c69cc34838a5137782c8d8a696218cb52aec146ebfa24bcafe7f867b9b663c9c', 'Deleted: sha256:dce66edf4c7934b2b52b725ba1ee74c7a547de24faeee9a8e1410f300b12c393', 'Successfully built 0c971558e46d', '07ad2871ba58ef7dd8318803e944e03fb3c9c9e5af3bca24049e2d3f14668d27', 'Fenix start: OK']
2019-03-26 16:30:22,779 base.py 218 INFO   Command sudo python set_compute_config.py output []
2019-03-26 16:33:07,378 stack.py 95 INFO   retry creating maintenance stack.......
2019-03-26 16:33:30,963 stack.py 65 INFO   stack doctor_test_maintenance DELETE_COMPLETE
2019-03-26 16:34:16,426 stack.py 65 INFO   stack doctor_test_maintenance CREATE_COMPLETE
2019-03-26 16:34:16,427 sample.py 31 INFO   sample app manager start......
2019-03-26 16:34:17,834 sample.py 85 INFO   sample inspector start......
 * Running on http://0.0.0.0:12348/ (Press CTRL+C to quit)
2019-03-26 16:34:19,238 main.py 112 INFO   wait aodh for 120s.......
 * Running on http://0.0.0.0:12345/ (Press CTRL+C to quit)
2019-03-26 16:36:19,327 maintenance.py 134 INFO   start maintenance.......
2019-03-26 16:36:23,807 sample.py 164 INFO   sample app manager received data = {'metadata': {'openstack_version': 'Rocky'}, 'state': 'MAINTENANCE', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'actions_at': '2019-03-26T14:36:49', 'service': 'fenix', 'reply_at': '2019-03-26T14:36:03', 'allowed_actions': [], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:36:23,807 sample.py 170 INFO   sample app manager state: MAINTENANCE
2019-03-26 16:36:23,852 sample.py 120 INFO   get_instance_ids {'instance_ids': ['5b74b0af-0c7c-4077-b8ff-07a5105eef14', '89f9a5f3-0f48-47b7-a329-e76077c6239d', '3e058a71-dd6a-4895-a52a-80892a59f337', '1d394b4c-c68a-47c4-8681-4c254c774bd3', 'c0c4d9f7-97bf-47f1-91b5-1b3e518cb5f1', '27a4ad01-b35b-414a-b208-4e6f720ce99e']}
2019-03-26 16:36:23,853 sample.py 240 INFO   sample app manager reply: {'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a', 'state': 'ACK_MAINTENANCE', 'instance_ids': ['5b74b0af-0c7c-4077-b8ff-07a5105eef14', '89f9a5f3-0f48-47b7-a329-e76077c6239d', '3e058a71-dd6a-4895-a52a-80892a59f337', '1d394b4c-c68a-47c4-8681-4c254c774bd3', 'c0c4d9f7-97bf-47f1-91b5-1b3e518cb5f1', '27a4ad01-b35b-414a-b208-4e6f720ce99e']}
87.254.192.34 - - [26/Mar/2019 16:36:23] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:36:49,860 sample.py 164 INFO   sample app manager received data = {'metadata': {'openstack_version': 'Rocky'}, 'state': 'SCALE_IN', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'actions_at': '2019-03-26T14:35:49', 'service': 'fenix', 'reply_at': '2019-03-26T14:35:49', 'allowed_actions': [], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:36:49,861 sample.py 170 INFO   sample app manager state: SCALE_IN
2019-03-26 16:37:05,120 stack.py 65 INFO   stack doctor_test_maintenance UPDATE_COMPLETE
2019-03-26 16:37:05,201 sample.py 144 INFO   scaled insances from 6 to 4
2019-03-26 16:37:05,286 sample.py 240 INFO   sample app manager reply: {'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a', 'state': 'ACK_SCALE_IN', 'instance_ids': ['89f9a5f3-0f48-47b7-a329-e76077c6239d', 'c0c4d9f7-97bf-47f1-91b5-1b3e518cb5f1', '1d394b4c-c68a-47c4-8681-4c254c774bd3', '27a4ad01-b35b-414a-b208-4e6f720ce99e']}
87.254.192.34 - - [26/Mar/2019 16:37:05] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:37:07,468 sample.py 164 INFO   sample app manager received data = {'metadata': {'openstack_version': 'Rocky'}, 'state': 'PREPARE_MAINTENANCE', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'actions_at': '2019-03-26T14:36:47', 'service': 'fenix', 'reply_at': '2019-03-26T14:36:47', 'allowed_actions': ['MIGRATE', 'LIVE_MIGRATE', 'OWN_ACTION'], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:37:07,468 sample.py 170 INFO   sample app manager state: PREPARE_MAINTENANCE
2019-03-26 16:37:07,513 sample.py 120 INFO   get_instance_ids {'instance_ids': ['89f9a5f3-0f48-47b7-a329-e76077c6239d']}
2019-03-26 16:37:07,514 sample.py 203 INFO   sample app manager got instances: ['89f9a5f3-0f48-47b7-a329-e76077c6239d']
2019-03-26 16:37:07,514 sample.py 240 INFO   sample app manager reply: {'state': 'ACK_PREPARE_MAINTENANCE', 'instance_actions': {'89f9a5f3-0f48-47b7-a329-e76077c6239d': 'MIGRATE'}, 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
87.254.192.34 - - [26/Mar/2019 16:37:07] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:38:21,037 sample.py 164 INFO   sample app manager received data = {'metadata': {}, 'state': 'INSTANCE_ACTION_DONE', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': ['89f9a5f3-0f48-47b7-a329-e76077c6239d'], 'service': 'fenix', 'allowed_actions': [], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:38:21,038 sample.py 170 INFO   sample app manager state: INSTANCE_ACTION_DONE
2019-03-26 16:38:21,038 sample.py 230 INFO   ['89f9a5f3-0f48-47b7-a329-e76077c6239d']
87.254.192.34 - - [26/Mar/2019 16:38:21] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:39:08,028 sample.py 164 INFO   sample app manager received data = {'metadata': {'openstack_version': 'Rocky'}, 'state': 'PLANNED_MAINTENANCE', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'actions_at': '2019-03-26T14:38:47', 'service': 'fenix', 'reply_at': '2019-03-26T14:38:47', 'allowed_actions': ['MIGRATE', 'LIVE_MIGRATE', 'OWN_ACTION'], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:39:08,029 sample.py 170 INFO   sample app manager state: PLANNED_MAINTENANCE
2019-03-26 16:39:08,074 sample.py 120 INFO   get_instance_ids {'instance_ids': ['1d394b4c-c68a-47c4-8681-4c254c774bd3', '27a4ad01-b35b-414a-b208-4e6f720ce99e']}
2019-03-26 16:39:08,075 sample.py 220 INFO   sample app manager got instances: ['1d394b4c-c68a-47c4-8681-4c254c774bd3', '27a4ad01-b35b-414a-b208-4e6f720ce99e']
2019-03-26 16:39:08,075 sample.py 240 INFO   sample app manager reply: {'state': 'ACK_PLANNED_MAINTENANCE', 'instance_actions': {'1d394b4c-c68a-47c4-8681-4c254c774bd3': 'MIGRATE', '27a4ad01-b35b-414a-b208-4e6f720ce99e': 'MIGRATE'}, 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
87.254.192.34 - - [26/Mar/2019 16:39:08] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:40:24,646 sample.py 164 INFO   sample app manager received data = {'metadata': {}, 'state': 'INSTANCE_ACTION_DONE', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': ['1d394b4c-c68a-47c4-8681-4c254c774bd3'], 'service': 'fenix', 'allowed_actions': [], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:40:24,647 sample.py 170 INFO   sample app manager state: INSTANCE_ACTION_DONE
2019-03-26 16:40:24,647 sample.py 230 INFO   ['1d394b4c-c68a-47c4-8681-4c254c774bd3']
87.254.192.34 - - [26/Mar/2019 16:40:24] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:41:40,740 sample.py 164 INFO   sample app manager received data = {'metadata': {}, 'state': 'INSTANCE_ACTION_DONE', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': ['27a4ad01-b35b-414a-b208-4e6f720ce99e'], 'service': 'fenix', 'allowed_actions': [], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:41:40,741 sample.py 170 INFO   sample app manager state: INSTANCE_ACTION_DONE
2019-03-26 16:41:40,741 sample.py 230 INFO   ['27a4ad01-b35b-414a-b208-4e6f720ce99e']
87.254.192.34 - - [26/Mar/2019 16:41:40] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:42:08,098 sample.py 164 INFO   sample app manager received data = {'metadata': {'openstack_version': 'Rocky'}, 'state': 'PLANNED_MAINTENANCE', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'actions_at': '2019-03-26T14:41:47', 'service': 'fenix', 'reply_at': '2019-03-26T14:41:47', 'allowed_actions': ['MIGRATE', 'LIVE_MIGRATE', 'OWN_ACTION'], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:42:08,099 sample.py 170 INFO   sample app manager state: PLANNED_MAINTENANCE
2019-03-26 16:42:08,144 sample.py 120 INFO   get_instance_ids {'instance_ids': ['89f9a5f3-0f48-47b7-a329-e76077c6239d', 'c0c4d9f7-97bf-47f1-91b5-1b3e518cb5f1']}
2019-03-26 16:42:08,144 sample.py 220 INFO   sample app manager got instances: ['89f9a5f3-0f48-47b7-a329-e76077c6239d', 'c0c4d9f7-97bf-47f1-91b5-1b3e518cb5f1']
2019-03-26 16:42:08,144 sample.py 87 INFO   Switch over to: doctor_ha_app_1 27a4ad01-b35b-414a-b208-4e6f720ce99e
2019-03-26 16:42:12,570 sample.py 240 INFO   sample app manager reply: {'state': 'ACK_PLANNED_MAINTENANCE', 'instance_actions': {'c0c4d9f7-97bf-47f1-91b5-1b3e518cb5f1': 'MIGRATE', '89f9a5f3-0f48-47b7-a329-e76077c6239d': 'MIGRATE'}, 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
87.254.192.34 - - [26/Mar/2019 16:42:12] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:43:30,840 sample.py 164 INFO   sample app manager received data = {'metadata': {}, 'state': 'INSTANCE_ACTION_DONE', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': ['89f9a5f3-0f48-47b7-a329-e76077c6239d'], 'service': 'fenix', 'allowed_actions': [], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:43:30,841 sample.py 170 INFO   sample app manager state: INSTANCE_ACTION_DONE
2019-03-26 16:43:30,841 sample.py 230 INFO   ['89f9a5f3-0f48-47b7-a329-e76077c6239d']
87.254.192.34 - - [26/Mar/2019 16:43:30] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:44:46,604 sample.py 164 INFO   sample app manager received data = {'metadata': {}, 'state': 'INSTANCE_ACTION_DONE', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': ['c0c4d9f7-97bf-47f1-91b5-1b3e518cb5f1'], 'service': 'fenix', 'allowed_actions': [], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:44:46,604 sample.py 170 INFO   sample app manager state: INSTANCE_ACTION_DONE
2019-03-26 16:44:46,604 sample.py 230 INFO   ['c0c4d9f7-97bf-47f1-91b5-1b3e518cb5f1']
87.254.192.34 - - [26/Mar/2019 16:44:46] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:45:08,677 sample.py 164 INFO   sample app manager received data = {'metadata': {'openstack_version': 'Rocky'}, 'state': 'MAINTENANCE_COMPLETE', 'reply_url': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'project_id': 'e2b561e912b9452b8f992eff81ad9a76', 'instance_ids': 'http://127.0.0.1:12347/v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a/e2b561e912b9452b8f992eff81ad9a76', 'actions_at': '2019-03-26T14:44:08', 'service': 'fenix', 'reply_at': '2019-03-26T14:44:08', 'allowed_actions': [], 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
2019-03-26 16:45:08,677 sample.py 170 INFO   sample app manager state: MAINTENANCE_COMPLETE
2019-03-26 16:45:35,517 stack.py 65 INFO   stack doctor_test_maintenance UPDATE_COMPLETE
2019-03-26 16:45:35,603 sample.py 144 INFO   scaled insances from 4 to 6
2019-03-26 16:45:35,603 sample.py 240 INFO   sample app manager reply: {'state': 'ACK_MAINTENANCE_COMPLETE', 'session_id': '854ba5f0-4fd4-11e9-aeec-54ab3a13729a'}
87.254.192.34 - - [26/Mar/2019 16:45:35] "POST /maintenance HTTP/1.1" 200 -
2019-03-26 16:45:42,971 main.py 121 ERROR  doctor maintenance test failed, Exception=HTTPConnectionPool(host='0.0.0.0', port=12347): Max retries exceeded with url: /v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f8aa17bc240>: Failed to establish a new connection: [Errno 111] Connection refused',))
2019-03-26 16:45:42,977 main.py 122 ERROR  Traceback (most recent call last):
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/urllib3/connection.py", line 141, in _new_conn
    (self.host, self.port), self.timeout, **extra_kw)
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/urllib3/util/connection.py", line 83, in create_connection
    raise err
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/urllib3/util/connection.py", line 73, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/urllib3/connectionpool.py", line 601, in urlopen
    chunked=chunked)
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/urllib3/connectionpool.py", line 357, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib64/python3.4/http/client.py", line 1137, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib64/python3.4/http/client.py", line 1182, in _send_request
    self.endheaders(body)
  File "/usr/lib64/python3.4/http/client.py", line 1133, in endheaders
    self._send_output(message_body)
  File "/usr/lib64/python3.4/http/client.py", line 963, in _send_output
    self.send(msg)
  File "/usr/lib64/python3.4/http/client.py", line 898, in send
    self.connect()
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/urllib3/connection.py", line 166, in connect
    conn = self._new_conn()
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/urllib3/connection.py", line 150, in _new_conn
    self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f8aa17bc240>: Failed to establish a new connection: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/requests/adapters.py", line 440, in send
    timeout=timeout
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/urllib3/connectionpool.py", line 639, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/urllib3/util/retry.py", line 388, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='0.0.0.0', port=12347): Max retries exceeded with url: /v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f8aa17bc240>: Failed to establish a new connection: [Errno 111] Connection refused',))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/doctor_tests/main.py", line 116, in test_maintenance
    maintenance.wait_maintenance_complete(session_id)
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/doctor_tests/scenario/maintenance.py", line 220, in wait_maintenance_complete
    state = self.get_maintenance_state(session_id)
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/doctor_tests/scenario/maintenance.py", line 208, in get_maintenance_state
    ret = requests.get(url, data=None, headers=headers)
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/requests/api.py", line 72, in get
    return request('get', url, params=params, **kwargs)
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/requests/api.py", line 58, in request
    return session.request(method=method, url=url, **kwargs)
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/requests/sessions.py", line 502, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/requests/sessions.py", line 612, in send
    r = adapter.send(request, **kwargs)
  File "/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/requests/adapters.py", line 504, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='0.0.0.0', port=12347): Max retries exceeded with url: /v1/maintenance/854ba5f0-4fd4-11e9-aeec-54ab3a13729a (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f8aa17bc240>: Failed to establish a new connection: [Errno 111] Connection refused',))
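The chained tracebacks above show `get_maintenance_state` in `doctor_tests/scenario/maintenance.py` calling `requests.get` against `0.0.0.0:12347` after the maintenance endpoint has stopped listening, so the poll dies on the first `[Errno 111] Connection refused`. A minimal sketch of a more tolerant poller follows; the function name `get_maintenance_state_with_retry` and the `retries`/`delay` values are illustrative assumptions, not part of the doctor test code — only the URL shape and the `requests` call come from the traceback.

```python
import time

import requests  # same HTTP client the failing test uses


def get_maintenance_state_with_retry(url, headers=None, retries=5, delay=2):
    """Poll a maintenance-state URL, tolerating transient connection refusals.

    Hypothetical helper: retries a few times before giving up, instead of
    propagating the first ConnectionError as the test run above does.
    """
    last_err = None
    for attempt in range(retries):
        try:
            ret = requests.get(url, headers=headers, timeout=10)
            ret.raise_for_status()
            # Assumes the endpoint returns JSON with a "state" field.
            return ret.json().get("state")
        except requests.exceptions.ConnectionError as e:
            # [Errno 111] Connection refused lands here; back off and retry.
            last_err = e
            time.sleep(delay)
    # Endpoint never came back: surface the last connection error.
    raise last_err
```

Whether retrying is the right fix depends on why the endpoint went away; if the sample admin-tool process crashed mid-session (rather than being briefly unreachable), retries only delay the same failure.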

2019-03-26 16:45:42,977 sample.py 36 INFO   sample app manager stop......
2019-03-26 16:45:42,982 sample.py 247 INFO   shutdown app manager server at 1553611542.9827988
87.254.192.34 - - [26/Mar/2019 16:45:42] "POST /shutdown HTTP/1.1" 200 -
2019-03-26 16:45:42,986 sample.py 91 INFO   sample inspector stop......
2019-03-26 16:45:42,990 sample.py 253 INFO   shutdown inspector app server at 1553611542.9907777
87.254.192.34 - - [26/Mar/2019 16:45:42] "POST /events/shutdown HTTP/1.1" 200 -
2019-03-26 16:45:42,993 maintenance.py 236 INFO   stack delete start.......
2019-03-26 16:45:56,745 stack.py 65 INFO   stack doctor_test_maintenance DELETE_COMPLETE
2019-03-26 16:45:56,745 apex.py 166 INFO   restore apply patches start......
2019-03-26 16:45:56,826 image.py 71 INFO   image delete start.......
2019-03-26 16:45:57,221 base.py 218 INFO   Command sudo python restore_config.py output ['restore', 'restore: /var/lib/config-data/puppet-generated/ceilometer/etc/ceilometer/event_definitions.yaml', 'Bak_file empty, so removing also: /var/lib/config-data/puppet-generated/ceilometer/etc/ceilometer/event_definitions.yaml']
2019-03-26 16:45:57,223 base.py 218 INFO   Command sudo python restore_compute_config.py output ['restoring nova.bak.']
2019-03-26 16:45:57,227 base.py 218 INFO   Command sudo python restore_compute_config.py output ['nova.bak does not exist.']
2019-03-26 16:45:58,822 image.py 76 INFO   image delete end.......
2019-03-26 16:45:58,823 user.py 163 INFO   user delete start......
2019-03-26 16:45:58,823 user.py 156 INFO   restore default quota......
2019-03-26 16:45:59,302 base.py 218 INFO   Command sudo python restore_aodh.py output []
2019-03-26 16:45:59,424 base.py 218 INFO   Command sudo python restore_compute_config.py output ['nova.bak does not exist.']
2019-03-26 16:46:00,860 user.py 187 INFO   user delete end......
ERROR: InvocationError for command '/home/jenkins/opnfv_slave_root/workspace/doctor-verify-maintenance-apex-sample-x86_64-master/.tox/py34/bin/doctor-test' (exited with code 1)
___________________________________ summary ____________________________________
ERROR:   py34: commands failed
Build step 'Execute shell' marked build as failure
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 38214 killed;
[ssh-agent] Stopped.
Archiving artifacts
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Request made to compress build log
Sending email to: tbramwell@linuxfoundation.org agardner@linuxfoundation.org rgrigar@linuxfoundation.org
[WS-CLEANUP] Deleting project workspace...
[WS-CLEANUP] Deferred wipeout is used...
Cannot delete workspace: java.lang.NoClassDefFoundError: Could not initialize class sun.nio.fs.UnixCopyFile
Option not to fail the build is turned on, so let's continue
Finished: FAILURE