# Eval Conciseness Criteria with LangChain
An example flow that converts a LangChain criteria evaluator application to a flex flow. Reference here for more information.
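In a flex flow, the evaluator is a plain callable Python class: promptflow instantiates it with the `--init` values and then calls it with each row of flow inputs. The sketch below is a simplified, hypothetical shape of such a class — a stub replaces the real LangChain criteria evaluator, and the scoring rule is invented for illustration only:

```python
class LangChainEvaluator:
    """Hypothetical simplified flex-flow entry point.

    __init__ receives the --init values (e.g. custom_connection); __call__
    receives the flow inputs. A stub replaces the real LangChain
    conciseness-criteria LLM call.
    """

    def __init__(self, custom_connection: dict):
        # The real flow would build an LLM client from this connection.
        self.custom_connection = custom_connection

    def __call__(self, input: str, prediction: str) -> dict:
        # Stub scoring rule (illustrative only): the real flow would ask
        # the LLM whether the prediction is concise.
        score = 1 if len(prediction) <= 2 * len(input) else 0
        return {"score": score}


if __name__ == "__main__":
    evaluator = LangChainEvaluator(custom_connection={"name": "my_llm_connection"})
    print(evaluator(input="What's 2+2?", prediction="4"))
```

The same `__init__`/`__call__` split is what lets `pf flow test` pass `--init` and `--inputs` separately.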
## Prerequisites
Install the promptflow SDK and other dependencies:

```bash
pip install -r requirements.txt
```
## Run flow
- Prepare your Azure OpenAI resource following this instruction and get your `api_key` if you don't have one.
- Or prepare your Anthropic resource following this instruction and get your `api_key` if you don't have one.
- Set up the connection

Go to the "Connections" tab on the "Prompt flow" page. Click the "Create" button, select one of the LLM tool supported connection types, and fill in the configurations.
Or use the CLI to create the connection:

```bash
# Override keys with --set to avoid yaml file changes
pf connection create --file ./connection.yml --set secrets.openai_api_key=<your_api_key> configs.azure_endpoint=<your_api_base> --name my_llm_connection
```
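The `--set` overrides above suggest that `connection.yml` describes a custom connection with an `azure_endpoint` config and an `openai_api_key` secret. A minimal sketch — the field values are placeholders, and the exact file contents are an assumption inferred from the command above, not copied from the repo:

```yaml
# Hypothetical connection.yml for a promptflow custom connection;
# secrets.openai_api_key and configs.azure_endpoint match the --set keys above.
$schema: https://azuremlschemas.azureedge.net/promptflow/latest/CustomConnection.schema.json
name: my_llm_connection
type: custom
configs:
  azure_endpoint: <your_api_base>
secrets:
  openai_api_key: <your_api_key>
```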
Note that in `flow.flex.yaml` we are using the connection named `my_llm_connection`.

```bash
# show registered connection
pf connection show --name my_llm_connection
```
- Run as normal Python file

```bash
python eval_conciseness.py
```
- Test flow

```bash
pf flow test --flow eval_conciseness:LangChainEvaluator --inputs input="What's 2+2?" prediction="What's 2+2? That's an elementary question. The answer you're looking for is that two and two is four." --init custom_connection=my_llm_connection
```
- Test flow with yaml

```bash
pf flow test --flow .
```
- Create run with multi-line data

```bash
pf run create --flow . --data ./data.jsonl --init custom_connection=my_llm_connection --stream
```

Reference here for the default behavior when `column-mapping` is not provided in the CLI.
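Each line of `data.jsonl` is a standalone JSON object whose keys match the flow inputs (`input` and `prediction` here). A quick sketch of writing and reading such a file — the example rows are illustrative, not taken from the repo:

```python
import json

# Illustrative rows whose keys match the flow inputs used above.
rows = [
    {"input": "What's 2+2?", "prediction": "2+2 is four."},
    {"input": "Name the capital of France.", "prediction": "Paris."},
]

# Write one JSON object per line (the JSONL format pf expects for --data).
with open("data.jsonl", "w") as f:
    for row in rows:
        f.write(json.dumps(row) + "\n")

# Read it back: each line parses independently.
with open("data.jsonl") as f:
    loaded = [json.loads(line) for line in f]

print(loaded[0]["input"])  # → What's 2+2?
```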
- List and show run meta

```bash
# list created run
pf run list

# get a sample run name
name=$(pf run list -r 10 | jq '.[] | select(.name | contains("eval_criteria_with_langchain_")) | .name' | head -n 1 | tr -d '"')

# show specific run detail
pf run show --name $name

# show output
pf run show-details --name $name

# show metrics
pf run show-metrics --name $name

# visualize run in browser
pf run visualize --name $name
```
## Run flow in cloud
- Assume we already have a connection named `open_ai_connection` in the workspace.

```bash
# set default workspace
az account set -s <your_subscription_id>
az configure --defaults group=<your_resource_group_name> workspace=<your_workspace_name>
```
- Create run

```bash
# run with environment variable reference connection in azureml workspace
pfazure run create --flow . --init init.json --data ./data.jsonl --stream
```