baoguangsheng / fast-detect-gpt

Code base for "Fast-DetectGPT: Efficient Zero-Shot Detection of Machine-Generated Text via Conditional Probability Curvature".

Excellent work, and need help. #14

Open PassiveIncomeMachine opened 3 months ago

PassiveIncomeMachine commented 3 months ago

First, thanks for your excellent work.

I am using a Mac M2 with GPU, and the MPS device starts up correctly.

However, I am running into the issue below; please take a look when you have time.

```
(fast-detect-gpt) scripts % python local_infer.py
MPS device is available.
Loading model /Users/WorkStation/AI/models/gpt-neo-2.7B...
Moving model to GPU...DONE (1.01s)
ProbEstimator: total 0 samples.
Local demo for Fast-DetectGPT, where the longer text has more reliable result.

Please enter your text: (Press Enter twice to start processing)
Disguised as police, they broke through a fence on Monday evening and broke into the cargo of a Swiss-bound plane to take the valuable items. The audacious heist occurred at an airport in a small European country, leaving authorities baffled and airline officials in shock.

Traceback (most recent call last):
  File "/Users/WorkStation/wsworkenv/ai-project/fast-detect-gpt/scripts/local_infer.py", line 100, in <module>
    run(args)
  File "/Users/WorkStation/wsworkenv/ai-project/fast-detect-gpt/scripts/local_infer.py", line 86, in run
    prob = prob_estimator.crit_to_prob(crit)
  File "/Users/WorkStation/wsworkenv/ai-project/fast-detect-gpt/scripts/local_infer.py", line 35, in crit_to_prob
    offset = np.sort(np.abs(np.array(self.real_crits + self.fake_crits) - crit))[100]
IndexError: index 100 is out of bounds for axis 0 with size 0
```

The issue probably comes from 'ProbEstimator: total 0 samples.' How can I solve this?
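(For context: the failing line at local_infer.py line 35 sorts the absolute distances from the observed criterion to every stored reference criterion and then indexes the 100th entry, so with zero loaded samples the array is empty and any index is out of bounds. Below is a minimal sketch of that logic with a hypothetical guard that is not in the original code; the probability estimate around it is illustrative, not necessarily the repo's exact implementation.)

```python
import numpy as np

def crit_to_prob(real_crits, fake_crits, crit, k=100):
    # Sorted distances from the observed criterion to every reference criterion.
    dists = np.sort(np.abs(np.array(real_crits + fake_crits) - crit))
    if dists.size <= k:
        # With 0 reference samples, dists is empty and dists[k] raises the
        # IndexError shown in the traceback above.
        raise RuntimeError('too few reference samples; check the ref_path argument')
    offset = dists[k]  # the line that fails at local_infer.py line 35
    # Illustrative estimate: fraction of machine-generated reference criteria
    # among all reference criteria within +/- offset of crit.
    real, fake = np.array(real_crits), np.array(fake_crits)
    cnt_real = np.sum((real > crit - offset) & (real < crit + offset))
    cnt_fake = np.sum((fake > crit - offset) & (fake < crit + offset))
    return cnt_fake / (cnt_real + cnt_fake)
```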

thanks

PassiveIncomeMachine commented 3 months ago

I added the hardcoded line below, which solved the issue.

```python
args.ref_path = '/Users/WorkStation/wsworkenv/ai-project/fast-detect-gpt/local_infer_ref'
```
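For anyone hitting the same error: the 'ProbEstimator: total 0 samples.' message suggests the estimator found nothing to load under ref_path. A quick sanity check is to list the reference result files there (assuming, as the message implies, that the estimator loads JSON result files from that directory):

```python
import glob
import os

ref_path = '/Users/WorkStation/wsworkenv/ai-project/fast-detect-gpt/local_infer_ref'
files = glob.glob(os.path.join(ref_path, '*.json'))
# An empty list means ProbEstimator has nothing to load, and crit_to_prob
# will fail exactly as in the traceback above.
print(f'{len(files)} reference result files found:', files)
```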

BTW, could the service provide any APIs in the future?

Thanks a million again for your excellent work!

baoguangsheng commented 3 months ago

You can also provide the argument via the command line. By default, it is set to './local_infer_ref', where we store some historical experimental results to help estimate the probability that a given text is machine-generated.
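For example (assuming the option is named --ref_path, matching the args.ref_path attribute above): since the default './local_infer_ref' is resolved relative to the current working directory, running the script from inside the scripts/ folder would find no reference files, which may be why you saw 0 samples. Running from the repository root should work:

```
(fast-detect-gpt) fast-detect-gpt % python scripts/local_infer.py --ref_path ./local_infer_ref
```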

We are currently considering an API service, but it is not yet finalized. I'll post an update in this thread once it's ready. Appreciate your feedback.