I found out that os.path.getsize in Python doesn't give me the correct size of large files. Smaller files work fine (I tried a 700 MB one), but if I run it on larger files (4 GB+) I get 394005664 bytes, which is far from correct. If I run the command "ls -lh" in Android on the same large file I get roughly 375 MB. What's wrong? It's obviously having a hard time handling the large files, but is there any workaround for this?
Edit: If I run ls -lsh I get 4579076; is that the occupied space in KB?
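As a sanity check on that number, assuming ls -s on Android prints allocated space in 1 KiB blocks, the byte count that stat -c %s reports (see the later edit) converts like this:

```python
# stat -c %s reports 4688972960 bytes for this file; in 1 KiB units:
print(4688972960 / 1024)  # 4579075.15625, close to the 4579076 that ls -ls shows
```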
Edit: stat -c %s gives me 4688972960, and that is correct. But in Python os.stat('path').st_size gives me 394005664
Edit: I'm going all in on testing now:
```python
import os

f = open("filepath", "rb")              # binary mode, to rule out text-mode quirks
old_file_position = f.tell()
f.seek(0, os.SEEK_END)                   # jump to the end of the file
size = f.tell()                          # position at the end == size in bytes
f.seek(old_file_position, os.SEEK_SET)   # restore the original position
f.close()
print(size)
```
Edit: So I guess I have to use stat -c %s for now, but I can't understand why os.path.getsize and ls -lh are giving me these values.
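For what it's worth, the bad value is exactly the correct size modulo 2**32, which looks like st_size being truncated to 32 bits somewhere in this Python build (an observation from the numbers above, not a confirmed diagnosis):

```python
correct = 4688972960   # from stat -c %s
reported = 394005664   # from os.path.getsize / os.stat().st_size

# The reported size is the real size wrapped around at 4 GiB (2**32 bytes):
print(correct % 2**32)   # 394005664
print(correct - 2**32)   # 394005664
assert correct % 2**32 == reported
```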