While the OpenAI detector has been useful for identifying content created by ChatGPT and other OpenAI-based models, it has been down more and more frequently as usage increases (especially by users here on Stack Exchange sites).
After installing it locally per the project README, I receive the following error when attempting to run it from the repo directory using python -m detector.server ../gpt-2-models/detector-base.pt:
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/ntd/src/gpt-2-output-dataset/detector/server.py", line 120, in <module>
    fire.Fire(main)
  File "/home/ntd/src/venv/openai-detector/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/home/ntd/src/venv/openai-detector/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/home/ntd/src/venv/openai-detector/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/home/ntd/src/gpt-2-output-dataset/detector/server.py", line 89, in main
    model.load_state_dict(data['model_state_dict'])
  File "/home/ntd/src/venv/openai-detector/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1671, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for RobertaForSequenceClassification:
    Missing key(s) in state_dict: "roberta.embeddings.position_ids".
    Unexpected key(s) in state_dict: "roberta.pooler.dense.weight", "roberta.pooler.dense.bias".
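In case it helps narrow things down, this is roughly how I confirmed the key mismatch outside the server script. It's just a quick sketch of my own, not anything from the repo: the checkpoint path is mine, and I'm assuming the same roberta-base base model that server.py appears to construct for the base checkpoint.

# Rough diagnostic sketch (my own, not from the repo): compare the keys stored
# in detector-base.pt against the keys the currently installed transformers
# version expects for RobertaForSequenceClassification.
import torch
from transformers import RobertaForSequenceClassification

data = torch.load('../gpt-2-models/detector-base.pt', map_location='cpu')
checkpoint_keys = set(data['model_state_dict'].keys())

# Assumption: detector-base.pt was trained on top of roberta-base.
model = RobertaForSequenceClassification.from_pretrained('roberta-base')
expected_keys = set(model.state_dict().keys())

print('missing from checkpoint:  ', sorted(expected_keys - checkpoint_keys))
print('unexpected in checkpoint: ', sorted(checkpoint_keys - expected_keys))

If the only differences really are the pooler weights and the position_ids buffer, that looks like a transformers version mismatch rather than a corrupted checkpoint, which is why I tried the version pin described next.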
I attempted to change the requirement to transformers==2.9.1 per comments in this issue, but then pip install -r requirements.txt fails as well.
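For completeness, this is the only edit I made before re-running the install (assuming the pin belongs in requirements.txt; every other line is left as shipped):

# requirements.txt (my attempted change; all other lines unchanged)
transformers==2.9.1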