# Eval Code Quality
An example flow defined using a class-based entry, which leverages a model config to evaluate the quality of a code snippet.
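The class-based entry pattern can be sketched in plain Python. All names below are illustrative stand-ins, not the actual contents of `code_quality.py`: the point is that the model config is injected once at construction time, and each call then evaluates one code snippet.

```python
from dataclasses import dataclass


@dataclass
class ModelConfig:
    """Illustrative stand-in for a promptflow model configuration."""
    azure_endpoint: str
    azure_deployment: str


class CodeQualityEvaluator:
    """Class-based flow entry: config at init, one snippet per call."""

    def __init__(self, model_config: ModelConfig):
        self.model_config = model_config

    def __call__(self, code: str) -> dict:
        # A real flow would prompt the configured model here; this sketch
        # just echoes what it would send.
        return {"code": code, "deployment": self.model_config.azure_deployment}


evaluator = CodeQualityEvaluator(
    ModelConfig(
        azure_endpoint="https://example.openai.azure.com",
        azure_deployment="gpt-4",
    )
)
print(evaluator(code='print("Hello, world!")'))
```

Separating construction from invocation is what lets the runtime reuse one configured instance across many data rows.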
## Prerequisites
Install the promptflow SDK and other dependencies:

```bash
pip install -r requirements.txt
```
## Run flow
- Prepare your Azure OpenAI resource following this instruction and get your `api_key` if you don't have one.
- Setup connection
Go to the "Prompt flow" "Connections" tab, click the "Create" button, select one of the connection types supported by the LLM tool, and fill in the configurations.
Or use the CLI to create the connection:

```bash
# Override keys with --set to avoid yaml file changes
pf connection create --file ../../connections/azure_openai.yml --set api_key=<your_api_key> api_base=<your_api_base> --name open_ai_connection
```
Note that in `flow.flex.yaml` we are using the connection named `open_ai_connection`.

```bash
# show registered connection
pf connection show --name open_ai_connection
```
- Run as normal Python file

```bash
python code_quality.py
```
- Test flow

```bash
# correct
pf flow test --flow . --inputs code='print(\"Hello, world!\")' --init init.json

# incorrect
pf flow test --flow . --inputs code='printf("Hello, world!")' --init init.json
```
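The second invocation is labeled incorrect because `printf` is C, not Python; executing that snippet raises a `NameError`, which a code-quality evaluator should flag:

```python
# printf is not a Python builtin, so the "incorrect" snippet fails at runtime.
try:
    printf("Hello, world!")
except NameError as err:
    print(err)  # name 'printf' is not defined
```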
- Create a run with multiple lines of data

```bash
pf run create --flow . --init init.json --data ./data.jsonl --stream
```
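The `--data` file is in JSON Lines format: one JSON record per line, one flow invocation per record. Assuming the flow's single input is `code` (as in the test commands above; the records here are illustrative, not the actual contents of the shipped `data.jsonl`), such a file can be generated like this:

```python
import json

# Illustrative records; the example's actual data.jsonl may differ.
rows = [
    {"code": 'print("Hello, world!")'},
    {"code": 'printf("Hello, world!")'},
]

with open("data.jsonl", "w") as f:
    for row in rows:
        f.write(json.dumps(row) + "\n")
```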
Refer here for the default behavior when column-mapping is not provided in the CLI.
- List and show run meta

```bash
# list created runs
pf run list

# get a sample run name
name=$(pf run list -r 10 | jq '.[] | select(.name | contains("eval_code_quality_")) | .name' | head -n 1 | tr -d '"')

# show specific run detail
pf run show --name $name

# show output
pf run show-details --name $name

# show metrics
pf run show-metrics --name $name

# visualize run in browser
pf run visualize --name $name
```
## Run flow in cloud
- Assume we already have a connection named `open_ai_connection` in the workspace.

```bash
# set default workspace
az account set -s <your_subscription_id>
az configure --defaults group=<your_resource_group_name> workspace=<your_workspace_name>
```
- Create run

```bash
# run with environment variable reference connection in azureml workspace
pfazure run create --flow . --init init.json --data ./data.jsonl --stream
```