Please give an example code snippet
Date: 2024-01-26 19:03:35
Sure, here is a simple example that runs a basic question-and-answer dialogue:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-125M")

def generate_response(input_str, chat_history=""):
    # Append the new input to the chat history and encode the full prompt
    prompt = chat_history + input_str
    input_ids = tokenizer.encode(prompt, return_tensors="pt")
    # Generate a response
    output = model.generate(
        input_ids,
        max_new_tokens=50,
        temperature=0.7,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        repetition_penalty=1.2,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Decode only the newly generated tokens (everything after the prompt)
    generated_text = tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
    return generated_text.strip()

# Example conversation
chat_history = ""
while True:
    user_input = input("You: ")
    response = generate_response(user_input, chat_history)
    # Accumulate the conversation so the model sees earlier turns
    chat_history += user_input + "\n" + response + "\n"
    print("Bot: " + response)
```
This example runs a dialogue using the EleutherAI/gpt-neo-125M model. The `generate_response` function combines the chat history with the new input to build the prompt, generates a reply, strips the prompt from the model output, and returns only the new text, which the main loop prints and appends to the history.
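One caveat: because the prompt grows with every turn, it will eventually exceed the model's context window (2048 tokens for gpt-neo-125M). A common remedy is to drop the oldest turns once the history gets too long. The sketch below illustrates the idea with a hypothetical `trim_history` helper (not part of transformers); for simplicity it budgets by character count rather than real token count:

```python
def trim_history(turns, max_chars=1000):
    """Drop the oldest turns until the serialized history fits the budget.

    `turns` is a list of (user, bot) string pairs. `max_chars` is a crude
    stand-in for a token budget; in practice you would count tokens with
    the tokenizer instead.
    """
    kept = list(turns)
    while kept and sum(len(u) + len(b) for u, b in kept) > max_chars:
        kept.pop(0)  # discard the oldest turn first
    return kept

# Example: the middle turn is very long, so older turns get dropped
turns = [("hi", "hello"), ("x" * 600, "y" * 600), ("how are you?", "fine")]
trimmed = trim_history(turns, max_chars=1000)
# only the most recent turns that fit within the budget remain
```

You would call `trim_history` on the stored turns before joining them into the prompt string, keeping the prompt bounded no matter how long the conversation runs.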