mirror of https://github.com/knrd1/chatgpt.git

image generation support

This commit is contained in:
parent 1a285bd24b
commit 9823436eb7

README.md (14 lines changed)
@@ -1,5 +1,5 @@
 # ChatGPT
-ChatGPT is a simple IRC bot written in Python. It connects to OpenAI endpoints to answer questions.
+ChatGPT is a simple IRC bot written in Python. It connects to OpenAI endpoints to answer questions or generate images.
 
 ChatGPT uses official bindings from OpenAI to interact with the API through HTTP requests:
 https://platform.openai.com/docs/api-reference
@@ -13,6 +13,7 @@ Install python3 and the official Python bindings:
 $ apt install python3 python3-pip (Debian/Ubuntu)
 $ yum install python3 python3-pip (RedHat/CentOS)
 $ pip3 install openai
+$ pip3 install pyshorteners
 $ git clone https://github.com/knrd1/chatgpt.git
 $ cd chatgpt
 $ cp example-chat.conf chat.conf
@@ -56,10 +57,15 @@ ChatGPT will interact only if you mention its nickname:
 10:35:56 <@knrd1> ChatGPT: do you like IRC?
 10:35:59 < ChatGPT> Yes, I like IRC. It is a great way to communicate with people from around the world.
 
 ```
+If you set the model to "dalle", the ChatGPT IRC bot will return a shortened URL to the generated image:
+```
+17:33:16 <@knrd1> ChatGPT: two horses dancing on the street
+17:33:23 < ChatGPT> https://tinyurl.com/2hr5uf4w
+```
 ### Model endpoint compatibility
 
-ChatGPT IRC bot can use two API endpoints: /v1/chat/completions and /v1/completions
+ChatGPT IRC bot can use three API endpoints.
 
 Following models support endpoint /v1/chat/completions:
 ```
@@ -69,4 +75,8 @@ Models that support /v1/completions:
 ```
 text-davinci-003, text-davinci-002, text-curie-001, text-babbage-001, text-ada-001, davinci, curie, babbage, ada
 ```
+Use the "dalle" model to generate the image:
+```
+dalle
+```
 More details about models: https://platform.openai.com/docs/models
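The README's endpoint split can be sketched as a small lookup helper (illustrative only — the completions model list and the "dalle" pseudo-model come from the README above; the function name, the `/v1/images/generations` route for image generation, and the chat-completions fallback are our assumptions, not code from this repo):

```python
# Illustrative helper: map a configured model name to the OpenAI endpoint
# it would need. The completions model list is copied from the README;
# the function itself is not part of the bot.
COMPLETION_MODELS = {
    "text-davinci-003", "text-davinci-002", "text-curie-001",
    "text-babbage-001", "text-ada-001", "davinci", "curie", "babbage", "ada",
}

def endpoint_for(model: str) -> str:
    if model == "dalle":
        # the bot treats "dalle" as a pseudo-model for image generation
        return "/v1/images/generations"
    if model in COMPLETION_MODELS:
        return "/v1/completions"
    # anything else is assumed to be a chat model
    return "/v1/chat/completions"
```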

chatgpt.py (16 lines changed)
@@ -3,6 +3,7 @@ import socket
 import ssl
 import time
 import configparser
+import pyshorteners
 from typing import Union, Tuple
 
 # Read configuration from file
@@ -139,6 +140,21 @@ while True:
             except Exception as e:
                 print("Error: " + str(e))
                 irc.send(bytes("PRIVMSG " + channel + " :API call failed. Try again later.\n", "UTF-8"))
+        elif model in ["dalle"]:
+            try:
+                response = openai.Image.create(
+                    prompt="Q: " + question + "\nA:",
+                    n=1,
+                    size="1024x1024"
+                )
+                answers = response.data[0].url
+                long_url = answers
+                type_tiny = pyshorteners.Shortener()
+                short_url = type_tiny.tinyurl.short(long_url)
+                irc.send(bytes("PRIVMSG " + channel + " :" + short_url + "\n", "UTF-8"))
+            except Exception as e:
+                print("Error: " + str(e))
+                irc.send(bytes("PRIVMSG " + channel + " :API call failed. Try again later.\n", "UTF-8"))
         else:
             print("Invalid model.")
             irc.send(bytes("PRIVMSG " + channel + " :Invalid model.\n", "UTF-8"))
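The flow of the new "dalle" branch can be exercised without network access by injecting the image call and the URL shortener as callables (a sketch; the real code calls `openai.Image.create` and `pyshorteners.Shortener().tinyurl.short` directly and writes to the IRC socket, and the function name here is ours):

```python
def dalle_reply(channel: str, question: str, generate_url, shorten) -> bytes:
    """Build the PRIVMSG bytes the bot sends for a "dalle" request.

    generate_url and shorten stand in for openai.Image.create and
    pyshorteners' tinyurl shortener, so the flow is testable offline.
    """
    try:
        # same prompt shape as the commit: "Q: <question>\nA:"
        long_url = generate_url("Q: " + question + "\nA:")
        text = shorten(long_url)
    except Exception as e:
        print("Error: " + str(e))
        text = "API call failed. Try again later."
    return bytes("PRIVMSG " + channel + " :" + text + "\n", "UTF-8")

# Dry run with fake dependencies:
reply = dalle_reply(
    "#chat", "two horses dancing on the street",
    lambda prompt: "https://example.com/image.png",  # fake image API
    lambda url: "https://tinyurl.com/abc",           # fake shortener
)
```

Passing the two network-bound steps in as parameters keeps the error handling identical to the commit's `try`/`except` while letting the PRIVMSG construction be verified in isolation.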