PolicyEngine / policyengine-app

PolicyEngine's free web app for computing the impact of public policy.
GNU Affero General Public License v3.0

AI prompt doesn't load #1703

Closed MaxGhenis closed 1 week ago

MaxGhenis commented 2 weeks ago

Example - this shows even when the microsim completes

https://policyengine.org/uk/policy?focus=policyOutput.analysis&reform=3177&region=uk&timePeriod=2024&baseline=1

[screenshot: the error state shown in the linked page]
czhou578 commented 1 week ago

@anth-volk It seems like in Analysis.jsx, we are not using the openai package at all. I can make the prompt show, but the data is hard-coded. How is the OpenAI API supposed to be used with regard to this component?

anth-volk commented 1 week ago

@czhou578 Thanks for bringing this up. We're actually going to switch over to a competitor, Claude (per this discussion), so could you hold off until then? We're aiming to have Claude implemented by Friday, as it's a high-priority fix. If you'd like to take that on (or do part of it with assistance), you could start once we have a subscription, which should be by 5 PM Eastern US today, and open an issue. Are you interested at all?

czhou578 commented 1 week ago

Sure. I'll do a part with assistance then. Thanks for letting me know!

MaxGhenis commented 1 week ago

I think the prompt can be fixed independently of the Claude switch. Viewing the prompt occurs before the LLM query.

anth-volk commented 1 week ago

@czhou578 In that case, let me look into it.

anth-volk commented 1 week ago

I now better understand the issue at hand here. The prompt itself is hard-coded, which is correct. It's the API that interacts with OpenAI: where you find the API call to /analysis in the code, that call generates the output. The problem identified in this issue is that the hard-coded prompt isn't rendered into the CodeEditor block.

czhou578 commented 1 week ago

Makes sense. I'll just make it so that the hard-coded prompt shows up in the CodeBlock.jsx component.
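For anyone picking this up, a minimal sketch of the separation being described: the prompt string is built locally (so it can be displayed immediately), while the /analysis call that feeds it to the LLM happens separately. All names here (`buildAnalysisPrompt`, the prompt wording, the props) are illustrative assumptions, not the actual Analysis.jsx implementation.

```javascript
// Hypothetical sketch: construct the analysis prompt from reform data so it
// can be shown in the CodeBlock component before (and independently of) the
// LLM query. Function and variable names are illustrative only.
function buildAnalysisPrompt(policyLabel, impactJson) {
  return [
    `Analyse the impact of the policy reform "${policyLabel}".`,
    "Here is the computed impact data as JSON:",
    JSON.stringify(impactJson, null, 2),
  ].join("\n\n");
}

// The same string can then both be rendered for the user, e.g.
//   <CodeBlock language="markdown" data={prompt} />
// and sent to the backend, e.g.
//   fetch("/analysis", { method: "POST", body: JSON.stringify({ prompt }) })
// so displaying the prompt never has to wait on the LLM response.
const prompt = buildAnalysisPrompt("Example reform", { budget: { change: 1e9 } });
console.log(prompt);
```

The key design point from the thread: rendering the prompt is a pure client-side concern, so the fix doesn't depend on whether the backend talks to OpenAI or Claude.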