parsaalian / webform-testing

Source code for "Semantic Constraint Inference for Web Form Test Generation".

Add Form Context to the Prompts #4

Closed. parsaalian closed this issue 1 year ago.

parsaalian commented 1 year ago

To know what the form is about in the first place, we can aggregate all the labels in the form, plus the global feedback, and use them in the prompt.
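
A minimal sketch of that aggregation, assuming the labels are read out of the rendered form HTML with BeautifulSoup (the helper name and the extraction strategy are assumptions for illustration, not the repository's actual code):

```python
from bs4 import BeautifulSoup

def build_form_context(form_html: str) -> str:
    """Join all <label> texts in a form into one context string.

    Hypothetical helper: the repository may collect labels differently.
    """
    soup = BeautifulSoup(form_html, "html.parser")
    labels = [label.get_text(strip=True) for label in soup.find_all("label")]
    return ", ".join(text for text in labels if text)
```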

parsaalian commented 1 year ago

Added the form labels to the prompt as the form context.

Changes in the system prompt for constraint generation:

The information that the user has provided is as follows:
- All the labels in the form, so that you have context about what the form is about.
- The input and the textual information associated with it, such as labels, hint texts, or inline feedback.
- The input fields that are relevant to the input field in question.
- Previously generated constraints for the input field in question, if available.
- Tested values and the inline feedback for the input field in question, if available.
- Tested values and the global feedback for the whole form, if available.

Changes in the user prompt for constraint generation:

The labels for the form are:
{form_context}
These labels are used to provide context for the functionality of the form.

We are generating constraints for the following input field:
{create_field_info_text(input_group)}

The relevant information available in the form is (in order of relevance):
{create_relevant_info_text(input_group, relevant_count)}

{"The previously generated constraints are:" if generated_constraint_string is not None else ""}
{generated_constraint_string if generated_constraint_string is not None else ""}

{"We have tried to generate constraints for this field before." if last_try is not None else ""}
{last_try["value"] if last_try is not None else ""}
{"And got the following inline and global feedback:" if last_try is not None else ""}
{last_try["feedback"] if last_try is not None else ""}
nashid commented 1 year ago

The prompt appears to be quite verbose, taking up approximately 550 tokens. I assume this is intentional.
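
If the token count matters, one rough way to measure it is with tiktoken; the model name below is only an example and not necessarily the one used in this project:

```python
import tiktoken

def count_tokens(prompt: str, model: str = "gpt-4") -> int:
    """Approximate how many tokens a prompt occupies for an OpenAI model."""
    encoding = tiktoken.encoding_for_model(model)
    return len(encoding.encode(prompt))
```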