I reproduced this a bit more simply by rescoping a token from unscoped to project-scoped using v3:
======================================================================
ERROR: test_token_rescoping (test_exercises.TestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/dolph/dolph/keystone-deploy/test_exercises.py", line 192, in test_token_rescoping
auth_url=KEYSTONE_ENDPOINT + 'v3')
File "/home/dolph/Environments/ansible/local/lib/python2.7/site-packages/keystoneclient/v3/client.py", l
ine 196, in __init__
self.authenticate()
File "/home/dolph/Environments/ansible/local/lib/python2.7/site-packages/keystoneclient/utils.py", line
318, in inner
return func(*args, **kwargs)
File "/home/dolph/Environments/ansible/local/lib/python2.7/site-packages/keystoneclient/httpclient.py",
line 503, in authenticate [293/1965]
resp = self.get_raw_token_from_identity_service(**kwargs)
File "/home/dolph/Environments/ansible/local/lib/python2.7/site-packages/keystoneclient/v3/client.py", l
ine 281, in get_raw_token_from_identity_service
_('Authorization failed: %s') % e)
AuthorizationFailure: Authorization failed: token must be bytes. (HTTP 400)
The backtrace can be reproduced in isolation with just:
$ python -c "from cryptography.fernet import Fernet; f = Fernet(Fernet.generate_key()).decrypt(u'asdf')"
I think the cause is that tokens which pass through json.loads() are read in as unicode, and pypi/cryptography simply refuses to operate on them. It should be safe to cast them to bytes.
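A minimal sketch of the suspected fix (stdlib-only, Fernet call elided; the JSON payload and key names here are hypothetical, not taken from the Keystone code):

```python
import json

# json.loads() decodes JSON strings to unicode text, never bytes.
payload = json.loads('{"token": "gAAAAABfake-token-for-illustration"}')
token = payload['token']
assert not isinstance(token, bytes)

# Encode to UTF-8 bytes before handing the token to Fernet.decrypt(),
# which is what raises "token must be bytes" when given unicode.
token_bytes = token.encode('utf-8')
assert isinstance(token_bytes, bytes)
```

Since fernet tokens are base64url-encoded, they are pure ASCII, so the UTF-8 encode is lossless.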