Failed
Console Output

Started by upstream project "doctor-verify-master" build number 90
originally caused by:
 Triggered by Gerrit: https://gerrit.opnfv.org/gerrit/64103
[EnvInject] - Loading node environment variables.
Building remotely on zte-virtual3 (doctor-apex-x86_64) in workspace /home/jenkins/opnfv/slave_root/workspace/doctor-verify-fault_management-apex-sample-x86_64-master
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent]   Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-jAF9b878mHfj/agent.5455
SSH_AGENT_PID=5457
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: /home/jenkins/opnfv/slave_root/workspace/doctor-verify-fault_management-apex-sample-x86_64-master@tmp/private_key_808825544710919463.key (/home/jenkins/opnfv/slave_root/workspace/doctor-verify-fault_management-apex-sample-x86_64-master@tmp/private_key_808825544710919463.key)
[ssh-agent] Using credentials jenkins-ci (Jenkins Master SSH)
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://gerrit.opnfv.org/gerrit/doctor
 > git init /home/jenkins/opnfv/slave_root/workspace/doctor-verify-fault_management-apex-sample-x86_64-master # timeout=10
Fetching upstream changes from https://gerrit.opnfv.org/gerrit/doctor
 > git --version # timeout=10
 > git fetch --tags --progress https://gerrit.opnfv.org/gerrit/doctor +refs/heads/*:refs/remotes/origin/* # timeout=15
 > git config remote.origin.url https://gerrit.opnfv.org/gerrit/doctor # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://gerrit.opnfv.org/gerrit/doctor # timeout=10
Fetching upstream changes from https://gerrit.opnfv.org/gerrit/doctor
 > git fetch --tags --progress https://gerrit.opnfv.org/gerrit/doctor refs/changes/03/64103/1 # timeout=15
Checking out Revision e054d1f5111f8ea735fe59d9cc44c8c0b393ea00 (refs/changes/03/64103/1)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e054d1f5111f8ea735fe59d9cc44c8c0b393ea00 # timeout=15
Commit message: "fix the configparser for  Python 2 and 3 Compatibility"
 > git rev-parse FETCH_HEAD^{commit} # timeout=10
 > git rev-list --no-walk baac6579556f8216b36db0d0f87f9c2d4f8b4ef5 # timeout=10
No emails were triggered.
[doctor-verify-fault_management-apex-sample-x86_64-master] $ /usr/bin/env bash /tmp/jenkins7812305556984896297.sh
Gathering IP information for Apex installer VM
 1     undercloud                     running
Installer VM detected
Installer ip is 192.168.122.31
fetch_os_creds.info: Fetching rc file...
fetch_os_creds.info: Verifying connectivity to 192.168.122.31...
fetch_os_creds.info: 192.168.122.31 is reachable!
fetch_os_creds.info: ... from Instack VM 192.168.122.31...
Warning: Permanently added '192.168.122.31' (ECDSA) to the list of known hosts.
-------- Credentials: --------
# Clear any old environment that may conflict.
for key in $( set | awk '{FS="="}  /^OS_/ {print $1}' ); do unset $key ; done
export OS_NO_CACHE=True
export COMPUTE_API_VERSION=1.1
export OS_USERNAME=admin
export no_proxy=,192.168.37.12,192.0.2.3
export OS_USER_DOMAIN_NAME=Default
export OS_VOLUME_API_VERSION=3
export OS_CLOUDNAME=overcloud
export OS_AUTH_URL=http://192.168.37.12:5000
export NOVA_VERSION=1.1
export OS_IMAGE_API_VERSION=2
export OS_PASSWORD=CAhmmwRN8t4JIJ7bhVxqaZRG0
export OS_PROJECT_DOMAIN_NAME=Default
export OS_IDENTITY_API_VERSION=3
export OS_PROJECT_NAME=admin
export OS_AUTH_TYPE=password
export PYTHONWARNINGS="ignore:Certificate has no, ignore:A true SSLContext object is not available"

# Add OS_CLOUDNAME to PS1
if [ -z "${CLOUDPROMPT_ENABLED:-}" ]; then
    export PS1=${PS1:-""}
    export PS1=\${OS_CLOUDNAME:+"(\$OS_CLOUDNAME)"}\ $PS1
    export CLOUDPROMPT_ENABLED=1
fi
export OS_PROJECT_ID=c9cb46561bd446b9abc5e0d222b52e0e
export OS_TENANT_NAME=admin
export OS_REGION_NAME=regionOne
[doctor-verify-fault_management-apex-sample-x86_64-master] $ /bin/sh -xe /tmp/jenkins1716900717010089860.sh
+ source /home/jenkins/opnfv-openrc.sh
+++ set
+++ awk '{FS="="}  /^OS_/ {print $1}'
++ export OS_NO_CACHE=True
++ OS_NO_CACHE=True
++ export COMPUTE_API_VERSION=1.1
++ COMPUTE_API_VERSION=1.1
++ export OS_USERNAME=admin
++ OS_USERNAME=admin
++ export no_proxy=,192.168.37.12,192.0.2.3
++ no_proxy=,192.168.37.12,192.0.2.3
++ export OS_USER_DOMAIN_NAME=Default
++ OS_USER_DOMAIN_NAME=Default
++ export OS_VOLUME_API_VERSION=3
++ OS_VOLUME_API_VERSION=3
++ export OS_CLOUDNAME=overcloud
++ OS_CLOUDNAME=overcloud
++ export OS_AUTH_URL=http://192.168.37.12:5000
++ OS_AUTH_URL=http://192.168.37.12:5000
++ export NOVA_VERSION=1.1
++ NOVA_VERSION=1.1
++ export OS_IMAGE_API_VERSION=2
++ OS_IMAGE_API_VERSION=2
++ export OS_PASSWORD=CAhmmwRN8t4JIJ7bhVxqaZRG0
++ OS_PASSWORD=CAhmmwRN8t4JIJ7bhVxqaZRG0
++ export OS_PROJECT_DOMAIN_NAME=Default
++ OS_PROJECT_DOMAIN_NAME=Default
++ export OS_IDENTITY_API_VERSION=3
++ OS_IDENTITY_API_VERSION=3
++ export OS_PROJECT_NAME=admin
++ OS_PROJECT_NAME=admin
++ export OS_AUTH_TYPE=password
++ OS_AUTH_TYPE=password
++ export 'PYTHONWARNINGS=ignore:Certificate has no, ignore:A true SSLContext object is not available'
++ PYTHONWARNINGS='ignore:Certificate has no, ignore:A true SSLContext object is not available'
++ '[' -z '' ']'
++ export PS1=
++ PS1=
++ export 'PS1=${OS_CLOUDNAME:+($OS_CLOUDNAME)} '
++ PS1='${OS_CLOUDNAME:+($OS_CLOUDNAME)} '
++ export CLOUDPROMPT_ENABLED=1
++ CLOUDPROMPT_ENABLED=1
++ export OS_PROJECT_ID=c9cb46561bd446b9abc5e0d222b52e0e
++ OS_PROJECT_ID=c9cb46561bd446b9abc5e0d222b52e0e
++ export OS_TENANT_NAME=admin
++ OS_TENANT_NAME=admin
++ export OS_REGION_NAME=regionOne
++ OS_REGION_NAME=regionOne
+ '[' -f /home/jenkins/os_cacert ']'
+ source /home/jenkins/opnfv-installer.sh
++ export INSTALLER_TYPE=apex
++ INSTALLER_TYPE=apex
++ export INSTALLER_IP=192.168.122.31
++ INSTALLER_IP=192.168.122.31
++ export SSH_KEY=/home/jenkins/installer_key_file
++ SSH_KEY=/home/jenkins/installer_key_file
+ sudo -E tox -e py34
py34 create: /home/jenkins/opnfv/slave_root/workspace/doctor-verify-fault_management-apex-sample-x86_64-master/.tox/py34
py34 installdeps: -r/home/jenkins/opnfv/slave_root/workspace/doctor-verify-fault_management-apex-sample-x86_64-master/requirements.txt
py34 develop-inst: /home/jenkins/opnfv/slave_root/workspace/doctor-verify-fault_management-apex-sample-x86_64-master
py34 installed: You are using pip version 9.0.1, however version 18.1 is available.,You should consider upgrading via the 'pip install --upgrade pip' command.,amqp==2.2.1,aodhclient==0.9.0,appdirs==1.4.3,asn1crypto==0.22.0,Babel==2.3.4,bcrypt==3.1.3,cachetools==2.0.0,certifi==2017.4.17,cffi==1.10.0,chardet==3.0.4,click==6.7,cliff==2.8.2,cmd2==0.7.5,contextlib2==0.5.5,cryptography==2.0.2,debtcollector==1.17.1,deprecation==1.0.1,-e git+https://gerrit.opnfv.org/gerrit/doctor@e054d1f5111f8ea735fe59d9cc44c8c0b393ea00#egg=doctor_tests,enum-compat==0.0.2,eventlet==0.20.0,fasteners==0.14.1,flake8==2.5.5,Flask==0.12.2,futurist==1.3.1,greenlet==0.4.12,idna==2.5,iso8601==0.1.11,itsdangerous==0.24,Jinja2==2.9.6,jsonpatch==1.16,jsonpointer==1.10,jsonschema==2.6.0,keystoneauth1==3.1.0,kombu==4.1.0,MarkupSafe==1.0,mccabe==0.4.0,monotonic==1.3,msgpack-python==0.4.8,netaddr==0.7.19,netifaces==0.10.6,openstacksdk==0.9.17,os-client-config==1.28.0,osc-lib==1.7.0,oslo.concurrency==3.21.1,oslo.config==4.11.1,oslo.context==2.17.1,oslo.i18n==3.17.1,oslo.log==3.30.2,oslo.messaging==5.30.6,oslo.middleware==3.30.1,oslo.serialization==2.20.2,oslo.service==1.25.1,oslo.utils==3.28.3,oslo.versionedobjects==1.26.2,paramiko==2.2.1,Paste==2.0.3,PasteDeploy==1.5.2,pbr==3.1.1,pep8==1.7.1,pika==0.10.0,pika-pool==0.1.3,positional==1.1.2,prettytable==0.7.2,pyasn1==0.3.1,pycparser==2.18,pyflakes==1.0.0,pyinotify==0.9.6,PyNaCl==1.1.2,pyOpenSSL==17.2.0,pyparsing==2.2.0,pyperclip==1.5.27,python-ceilometerclient==2.9.0,python-cinderclient==3.1.0,python-congressclient==1.8.0,python-dateutil==2.6.1,python-glanceclient==2.8.0,python-heatclient==1.11.1,python-keystoneclient==3.13.0,python-neutronclient==6.5.0,python-novaclient==9.1.2,python-openstackclient==3.12.1,python-swiftclient==3.4.0,python-vitrageclient==1.4.0,pytz==2017.2,PyYAML==3.12,repoze.lru==0.6,requests==2.18.2,requestsexceptions==1.3.0,rfc3986==1.1.0,Routes==2.4.1,scp==0.10.2,simplejson==3.11.1,six==1.10.0,statsd==3.2.1,stevedore==1.25.1,tenacity==4.4.0,urllib3==1.22,vine==1.1.4,virtualenv==15.1.0,warlock==1.2.0,WebOb==1.7.3,Werkzeug==0.12.2,wrapt==1.10.10
py34 runtests: PYTHONHASHSEED='154939898'
py34 runtests: commands[0] | doctor-test
/home/jenkins/opnfv/slave_root/workspace/doctor-verify-fault_management-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/paramiko/client.py:711: UserWarning: Unknown ssh-ed25519 host key for 192.168.122.31: b'4661c7c49d01d0948d60eebb60cbb8c0'
  key.get_fingerprint())))
2018-10-27 10:41:18,912 main.py 128 INFO   doctor test starting.......
2018-10-27 10:41:18,912 apex.py 43 INFO   Setup Apex installer start......
2018-10-27 10:41:18,913 base.py 92 INFO   Get SSH keys from apex installer......
2018-10-27 10:41:19,269 apex.py 60 INFO   Get overcloud config details from Apex installer......
2018-10-27 10:41:19,270 base.py 110 INFO   Run command=source stackrc; nova list | grep ' overcloud-' in apex installer......
2018-10-27 10:41:24,770 base.py 119 INFO   Output=['| 7f49a6f0-a75f-43e5-a469-61ef470eb56d | overcloud-controller-0  | ACTIVE | -          | Running     | ctlplane=192.0.2.10 |', '| be97f553-86ec-4ae1-b4d8-02728d8fff09 | overcloud-novacompute-0 | ACTIVE | -          | Running     | ctlplane=192.0.2.8  |'] command=source stackrc; nova list | grep ' overcloud-' in apex installer
2018-10-27 10:41:24,771 base.py 124 INFO   Check command=grep docker /home/stack/deploy_command return in apex installer......
2018-10-27 10:41:24,836 base.py 127 INFO   return 0
2018-10-27 10:41:24,837 apex.py 73 INFO   controller_ips:['192.0.2.10']
2018-10-27 10:41:24,837 apex.py 74 INFO   compute_ips:['192.0.2.8']
2018-10-27 10:41:24,837 apex.py 75 INFO   use_containers:True
2018-10-27 10:41:26,813 apex.py 115 INFO   Set apply patches start......
/home/jenkins/opnfv/slave_root/workspace/doctor-verify-fault_management-apex-sample-x86_64-master/.tox/py34/lib/python3.4/site-packages/paramiko/client.py:711: UserWarning: Unknown ssh-ed25519 host key for 192.0.2.10: b'7a2bc1a11eb5322873693220bd120c6b'
  key.get_fingerprint())))
2018-10-27 10:41:30,525 base.py 61 INFO   Setup ssh stunnel in apex installer......
2018-10-27 10:41:30,525 base.py 74 INFO   tunnel for port 12346
2018-10-27 10:41:30,531 image.py 48 INFO   image create start......
2018-10-27 10:41:38,219 image.py 68 INFO   image create end......
2018-10-27 10:41:38,219 user.py 70 INFO   user create start......
2018-10-27 10:41:38,678 user.py 86 INFO   create project......
2018-10-27 10:41:39,097 user.py 95 INFO   test project <Project description=, domain_id=default, enabled=True, id=4057f8d04dc047e3b22a145151d21bbf, is_domain=False, links={'self': 'http://192.0.2.3:35357/v3/projects/4057f8d04dc047e3b22a145151d21bbf'}, name=doctor, parent_id=default, tags=[]>
2018-10-27 10:41:39,681 user.py 103 INFO   create user......
2018-10-27 10:41:40,317 user.py 113 INFO   test user <User domain_id=default, enabled=True, id=cbedf89090514fdfb054ec4a069cf6d8, links={'self': 'http://192.0.2.3:35357/v3/users/cbedf89090514fdfb054ec4a069cf6d8'}, name=doctor, options={}, password_expires_at=None>
2018-10-27 10:41:40,557 user.py 127 INFO   role _member_ already created......
2018-10-27 10:41:40,557 user.py 128 INFO   test role <Role description=None, domain_id=None, id=fc7ca3eb155e44108b8e35009816350a, links={'self': 'http://192.0.2.3:35357/v3/roles/fc7ca3eb155e44108b8e35009816350a'}, name=_member_>
2018-10-27 10:41:42,486 user.py 78 INFO   user create end......
2018-10-27 10:41:42,487 main.py 55 INFO   doctor fault management test starting.......
2018-10-27 10:41:42,976 fault_management.py 65 INFO   fault management setup......
2018-10-27 10:41:42,976 user.py 190 INFO   quota update start......
2018-10-27 10:41:42,977 user.py 206 INFO   default quota update start......
2018-10-27 10:41:44,447 user.py 217 INFO   user quota update start......
2018-10-27 10:41:45,185 user.py 230 INFO   quota update end......
2018-10-27 10:41:45,186 network.py 41 INFO   network create start.......
2018-10-27 10:41:46,559 main.py 81 ERROR  doctor fault management test failed, Exception=<html><body><h1>503 Service Unavailable</h1>
No server is available to handle this request.
</body></html>

2018-10-27 10:41:46,559 fault_management.py 94 INFO   fault management cleanup......
2018-10-27 10:41:46,560 sample.py 79 INFO   sample inspector stop......
2018-10-27 10:41:46,560 sample.py 35 INFO   sample monitor stop......
2018-10-27 10:41:46,560 sample.py 31 INFO   sample consumer stop......
2018-10-27 10:41:46,560 alarm.py 84 INFO   alarm delete start.......
2018-10-27 10:41:50,776 alarm.py 93 INFO   alarm delete end.......
2018-10-27 10:41:50,777 instance.py 76 INFO   instance delete start.......
2018-10-27 10:41:52,413 instance.py 89 INFO   instance delete end.......
2018-10-27 10:41:52,414 network.py 61 INFO   subnet delete start.......
2018-10-27 10:41:52,414 network.py 64 INFO   subnet delete end.......
2018-10-27 10:41:52,414 network.py 66 INFO   network delete start.......
2018-10-27 10:41:52,415 network.py 69 INFO   network delete end.......
2018-10-27 10:41:52,415 apex.py 169 INFO   restore apply patches start......
2018-10-27 10:42:00,262 image.py 71 INFO   image delete start.......
2018-10-27 10:42:00,749 image.py 76 INFO   image delete end.......
2018-10-27 10:42:00,749 user.py 163 INFO   user delete start......
2018-10-27 10:42:02,557 user.py 187 INFO   user delete end......
ERROR: InvocationError for command '/home/jenkins/opnfv/slave_root/workspace/doctor-verify-fault_management-apex-sample-x86_64-master/.tox/py34/bin/doctor-test' (exited with code 1)
___________________________________ summary ____________________________________
ERROR:   py34: commands failed
Build step 'Execute shell' marked build as failure
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 5457 killed;
[ssh-agent] Stopped.
Archiving artifacts
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Request made to compress build log
Sending email to: tbramwell@linuxfoundation.org agardner@linuxfoundation.org rgrigar@linuxfoundation.org
[WS-CLEANUP] Deleting project workspace...
[WS-CLEANUP] done
Finished: FAILURE