error UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte

https://github.com/affinelayer/pix2pix-tensorflow/tree/master/tools

An error occurred when running "process.py" from the above repository.

    python tools/process.py --input_dir data --operation resize --output_dir data2/resize
    data/0.jpg -> data2/resize/0.png

Traceback (most recent call last):

    File "tools/process.py", line 235, in <module>
      main()
    File "tools/process.py", line 167, in main
      src = load(src_path)
    File "tools/process.py", line 113, in load
      contents = open(path).read()
    File "/home/user/anaconda3/envs/tensorflow_2/lib/python3.5/codecs.py", line 321, in decode
      (result, consumed) = self._buffer_decode(data, self.errors, final)
    UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte

What is the cause of the error? Python's version is 3.5.2.

Python tries to convert a byte array (a bytes object which it assumes to contain a utf-8-encoded string) to a unicode string (str). This is, of course, a decoding according to the utf-8 rules. While attempting this, it encounters a byte sequence that is not allowed in utf-8-encoded strings (namely the 0xff at position 0).
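As a side note (not stated in the original answer): the file being processed is a .jpg, and JPEG files begin with the bytes 0xff 0xd8 (the SOI marker), which is exactly the byte reported at position 0. A minimal sketch reproducing the error without any file at all:

```python
# The first bytes of any JPEG file: 0xff is never a valid
# start byte in utf-8, so decoding fails immediately.
jpeg_header = b"\xff\xd8\xff\xe0"

try:
    jpeg_header.decode("utf-8")
except UnicodeDecodeError as e:
    print(e)  # → 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte
```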

Since you did not provide any code we could look at, we can only guess at the rest.

From the stack trace we can assume that the triggering action was the reading of a file (contents = open(path).read()). I propose recoding it like this:

    with open(path, 'rb') as f:
      contents = f.read()

The b in the mode specifier of open() states that the file shall be treated as binary, so contents will remain a bytes object. No decoding attempt will happen this way.
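A minimal, self-contained sketch of the difference (using a throwaway temp file rather than the repository's data files): writing non-utf-8 bytes and reading them back in binary mode returns the raw bytes untouched.

```python
import os
import tempfile

# Write bytes that are not valid utf-8 (a fake JPEG header).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\xff\xd8\xff\xe0")
    path = f.name

# 'rb' mode: no decoding happens, contents stays a bytes object.
with open(path, "rb") as f:
    contents = f.read()

print(type(contents))  # → <class 'bytes'>

os.remove(path)
```

Reading the same file with plain open(path).read() would raise the UnicodeDecodeError from the question, since text mode decodes as utf-8 by default (on most platforms).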

From: stackoverflow.com/q/42339876
