Group 3: [Engineering Prompts To Get The Best Output]​
​
Names: Alissa, Emily, Beyonce, Stanley, Lilly, Feixiang
-
Background: We asked ChatGPT questions and reworded them a few times to see whether the phrasing changed its answers. We asked questions about science, politics, and our own school; it gave mostly correct answers, but some responses differed from each other in major ways.
-
Prompts: ​
-Where is UConn located?​
-Who are the current presidential candidates?​
-What is mitosis?​
-
Reflection: When asked where UConn is located, ChatGPT said UConn is in Storrs, but it did not mention that UConn has branch campuses. Asking with more detail helps, e.g., "Where are UConn and its branch campuses located?" It is also important to ask how recent the model's data is. ChatGPT still said Joe Biden was running for president because the data it was trained on is old; even though the model was updated in September, its data comes from before that. ChatGPT also does not have a mind of its own when asked to give a political evaluation. People should check actual news articles and updates to make sure the information is current. For the question "What is mitosis?", ChatGPT gave the correct answer without needing many follow-up questions, and it listed the many steps within the process. If you ask ChatGPT specific questions, it mostly gets them right.
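The "ask with more details" advice above can be sketched as code. This is a minimal, hypothetical sketch in Python: the helper name `build_messages` is our own invention, not part of any library, but the role/content message format matches what most chat model APIs expect.

```python
# Sketch: turning a vague question into a more specific prompt, expressed
# as chat-style messages. build_messages is a made-up helper for this demo.

def build_messages(question, context=None):
    """Build a chat message list; optional context goes in as a system note."""
    messages = []
    if context:
        messages.append({"role": "system", "content": context})
    messages.append({"role": "user", "content": question})
    return messages

# Vague prompt -- likely to get only the main campus in the answer.
vague = build_messages("Where is UConn located?")

# Refined prompt -- asks about branch campuses and the data's cutoff date too.
refined = build_messages(
    "Where are UConn and its branch campuses located? "
    "Also, what date is the most recent data you were trained on?"
)
```

The refined version packs the follow-up questions into the original prompt, so the model has no room to answer with only the main campus.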
-
To get the best output, you sometimes need to fill in information that ChatGPT doesn't have. I told ChatGPT that Kamala Harris was Joe Biden's replacement so it would give the correct answer. However, ChatGPT has no way of fact-checking that information, so even if you give it false information, it will treat it as true. For example, when I told ChatGPT that Mr. Beast was running for president, it added Mr. Beast to the list of contenders.
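The "fill in missing information" technique above can also be sketched. This is a hypothetical illustration: `with_user_fact` is our own name, and the point is that the model would accept the supplied fact uncritically, whether it is true (the Kamala Harris example) or false (the Mr. Beast example), so this only helps when your fact is correct.

```python
# Sketch: supplying a fact the model lacks by passing it as context before
# the question. The model cannot verify the fact, so a false one would be
# echoed back as if it were true.

def with_user_fact(fact, question):
    """Prepend a user-supplied fact as a system message before the question."""
    return [
        {"role": "system", "content": "The user says: " + fact},
        {"role": "user", "content": question},
    ]

msgs = with_user_fact(
    "Kamala Harris replaced Joe Biden as a presidential candidate.",
    "Who are the current presidential candidates?",
)
```

Because the supplied fact arrives with the same authority as any other context, double-check it yourself before handing it to the model.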