Eval Conciseness Criteria with LangChain

An example flow that converts a LangChain criteria evaluator application into a flex flow. Refer here for more information.
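For orientation, here is a minimal Python sketch of what such a flex-flow entry could look like: a plain callable class whose __init__ receives the connection and whose __call__ runs LangChain's built-in "conciseness" criteria evaluator. This is not the repository's eval_conciseness.py; the deployment name, API version, and the exact connection access pattern are assumptions (the secret/config key names are taken from the connection command shown later).

# Sketch only, not the repo's eval_conciseness.py: a callable class that a flex
# flow can load, wrapping LangChain's criteria evaluator.
from langchain.evaluation import load_evaluator
from langchain_openai import AzureChatOpenAI
from promptflow.connections import CustomConnection


class LangChainEvaluator:
    def __init__(self, custom_connection: CustomConnection):
        # Assumed key names; they must match what the custom connection stores
        # (see the `pf connection create` command below).
        self.llm = AzureChatOpenAI(
            azure_endpoint=custom_connection.configs["azure_endpoint"],
            api_key=custom_connection.secrets["openai_api_key"],
            api_version="2024-02-01",   # assumed API version
            azure_deployment="gpt-4o",  # hypothetical deployment name
        )
        # "conciseness" is one of LangChain's built-in criteria.
        self.evaluator = load_evaluator("criteria", criteria="conciseness", llm=self.llm)

    def __call__(self, input: str, prediction: str) -> dict:
        # The criteria evaluator typically returns keys like "reasoning", "value", "score".
        return self.evaluator.evaluate_strings(prediction=prediction, input=input)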

Prerequisites

Install the promptflow SDK and other dependencies:

pip install -r requirements.txt

Run flow

  • Prepare your Azure OpenAI resource following this instruction and get your api_key if you don't have one.

  • Or prepare your Anthropic resource following this instruction and get your api_key if you don't have one.

  • Set up connection

Go to "Prompt flow" "Connections" tab. Click on "Create" button, select one of LLM tool supported connection types and fill in the configurations.

Or use the CLI to create the connection:

# Override keys with --set to avoid yaml file changes
pf connection create --file ./connection.yml --set secrets.openai_api_key=<your_api_key> configs.azure_endpoint=<your_api_base> --name my_llm_connection

Note that in flow.flex.yaml we are using the connection named my_llm_connection.

# show registered connection
pf connection show --name my_llm_connection
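The connection can also be created and inspected from Python instead of the CLI; a sketch, assuming the PFClient and CustomConnection entity from the promptflow SDK, with the same key names as in connection.yml above:

# Sketch: Python SDK equivalent of the `pf connection create/show` commands above.
from promptflow.client import PFClient
from promptflow.entities import CustomConnection

pf = PFClient()
connection = CustomConnection(
    name="my_llm_connection",
    secrets={"openai_api_key": "<your_api_key>"},   # secret values
    configs={"azure_endpoint": "<your_api_base>"},  # non-secret configuration
)
pf.connections.create_or_update(connection)
print(pf.connections.get("my_llm_connection"))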
  • Run as a normal Python file
python eval_conciseness.py
  • Test flow
pf flow test --flow eval_conciseness:LangChainEvaluator --inputs input="What's 2+2?" prediction="What's 2+2? That's an elementary question. The answer you're looking for is that two and two is four." --init custom_connection=my_llm_connection
  • Test flow with yaml
pf flow test --flow .
  • Create a run with multi-line data
pf run create --flow . --data ./data.jsonl --init custom_connection=my_llm_connection --stream

Refer here for the default behavior when column-mapping is not provided in the CLI.
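The same batch run can be created from Python; a sketch, assuming PFClient.run accepts an init dict for flex-flow class initialization and resolves the connection by name:

# Sketch: Python SDK counterpart of `pf run create` above.
from promptflow.client import PFClient

pf = PFClient()
run = pf.run(
    flow=".",                                         # folder containing flow.flex.yaml
    data="./data.jsonl",                              # one input record per line
    init={"custom_connection": "my_llm_connection"},  # passed to the flow class __init__
)
print(run.name)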

  • List and show run metadata
# list created run
pf run list
# get a sample run name
name=$(pf run list -r 10 | jq '.[] | select(.name | contains("eval_criteria_with_langchain_")) | .name'| head -n 1 | tr -d '"')
# show specific run detail
pf run show --name $name
# show output
pf run show-details --name $name
# show metrics
pf run show-metrics --name $name
# visualize run in browser
pf run visualize --name $name
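The same run metadata is reachable from Python; a sketch, assuming PFClient's run operations (list/get/get_details/get_metrics/visualize):

# Sketch: Python SDK counterparts of the pf run CLI commands above.
from promptflow.client import PFClient

pf = PFClient()
runs = pf.runs.list(max_results=10)                  # ~ `pf run list -r 10`
run = next(r for r in runs if "eval_criteria_with_langchain_" in r.name)
print(pf.runs.get(run.name))                         # run detail
print(pf.get_details(run))                           # per-line inputs/outputs (DataFrame)
print(pf.get_metrics(run))                           # aggregated metrics
pf.visualize(run)                                    # open the visualization in a browser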

Run flow in cloud

  • Assume we already have a connection named open_ai_connection in the workspace.
# set default workspace
az account set -s <your_subscription_id>
az configure --defaults group=<your_resource_group_name> workspace=<your_workspace_name>
  • Create run
# run with environment variables referencing the connection in the AzureML workspace
pfazure run create --flow . --init init.json --data ./data.jsonl --stream
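The contents of init.json are not shown above. Presumably it carries the same class-init override as the local --init flag; the sketch below writes such a file, where the key name and connection value are assumptions mirroring the local run and the open_ai_connection mentioned earlier.

# Sketch: generate init.json for the cloud run. The key/value are assumptions
# that mirror the local `--init custom_connection=my_llm_connection` flag,
# pointing at the workspace connection named open_ai_connection.
import json

with open("init.json", "w") as f:
    json.dump({"custom_connection": "open_ai_connection"}, f, indent=2)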
