tokenize.py uses N_TOKENS without prior definition
Bug #127023 reported by Scott Kitterman
| Affects | Status | Importance | Assigned to | Milestone |
|---|---|---|---|---|
| python2.5 (Ubuntu) | Invalid | Undecided | Unassigned | |
Bug Description
Binary package hint: apport
While doing a test dist-upgrade with update-manager, update-manager crashed, and Apport did not survive the attempt to collect the crash information. This is a current Feisty Kubuntu system.
PythonArgs: ['/usr/

Traceback:

```
Traceback (most recent call last):
  File "/usr/share/
    import signal, inspect, atexit, grp
  File "inspect.py", line 31, in <module>
    import sys, os, types, string, re, dis, imp, tokenize, linecache
  File "tokenize.py", line 38, in <module>
    COMMENT = N_TOKENS
NameError: name 'N_TOKENS' is not defined
```
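For context on why this line can fail: the stdlib `tokenize` module does `from token import *` and then (in Python 2.5) defines `COMMENT = N_TOKENS`, so `N_TOKENS` must come from the real `token` module. A likely cause of the traceback (an assumption, not confirmed in this report) is that a stray or stale file named `token.py` earlier on `sys.path`, or a partially upgraded install, shadowed the stdlib module so `N_TOKENS` was never defined. A minimal sketch of the healthy relationship between the two modules:

```python
import token
import tokenize

# The stdlib token module defines the N_TOKENS constant; tokenize builds
# its own constants on top of token's (Python 2.5's tokenize literally ran
# `from token import *` followed by `COMMENT = N_TOKENS`). If token.py is
# shadowed by another file of the same name, N_TOKENS is missing and
# importing tokenize raises the NameError shown in the traceback above.
print(token.N_TOKENS)    # an integer token-type constant
print(tokenize.COMMENT)  # tokenize's COMMENT constant, derived from token's
```

Checking `import token; print(token.__file__)` on the affected machine would show whether the module being imported is the stdlib one.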
Changed in apport:
status: Incomplete → Confirmed