TypeError: ljust() argument 2 must be char, not unicode


I'm getting the following error when using the s2sphere library in Python 2.7:

ljust() argument 2 must be char, not unicode

The method (linked to the file on GitHub):

    @classmethod
    def from_token(cls, token):
        """Creates a CellId from a hex encoded cell id string, called a token.
        :param str token:
            A hex representation of the cell id. If the input is shorter than
            16 characters, zeros are appended on the right.
        """
        return cls(int(token.ljust(16, '0'), 16))

The file appears to be ASCII-encoded, so I'm scratching my head over why I am seeing this.
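For reference, a minimal reproduction under Python 2.7 (the token value here is made up):

    import s2sphere

    # '89c259' is a plain byte str literal in the calling module
    s2sphere.CellId.from_token('89c259')
    # TypeError: ljust() argument 2 must be char, not unicode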


1 Answer

Answered by rupello:

The file imports unicode_literals at the top:

    from __future__ import print_function, unicode_literals, division

Hence the '0' is a unicode string, and the token parameter needs to be unicode to match: in Python 2, str.ljust() requires the fill character to be the same type as the string being padded, so calling from_token() with a byte str triggers this TypeError.
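A minimal sketch of the mismatch and the fix in a Python 2.7 session (the token value is made up):

    # byte str padded with a unicode fill character -> the reported TypeError
    '89c259'.ljust(16, u'0')
    # TypeError: ljust() argument 2 must be char, not unicode

    # unicode token padded with a unicode fill character -> works
    u'89c259'.ljust(16, u'0')
    # u'89c2590000000000'

So passing the token as unicode, e.g. CellId.from_token(u'89c259'), or decoding it first with token.decode('ascii'), avoids the error.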