Returning flow to the user (/u/Amgadoz, Python Education)

What’s the best way to return flow to the user when using an LLM?

Details

I have a chatbot application where the user can ask the LLM about their orders. The LLM has access to a set of tools that fetch data from a database. The database uses customer_id as the primary key of the transactions table, and it has to be passed to any tool that queries the database.
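For example, one of those data tools ends up looking roughly like this (get_orders and the db helper are just illustrative names, not my actual code):

    def get_orders(customer_id: str):
        # customer_id is the primary key of the transactions table,
        # so every query against it needs the id
        rows = db.execute(
            "SELECT * FROM transactions WHERE customer_id = ?",
            (customer_id,),
        )
        return rows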

Now the model should ideally ask the user for their customer_id when it's needed for such tools. The way this is implemented is through another tool called get_customer_id, whose description is something like "retrieves the user's customer id".
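The tool definition given to the model looks roughly like this (shown here in an OpenAI-style function-calling schema; the exact shape depends on the provider, so treat this as an illustration):

    get_customer_id_tool = {
        "type": "function",
        "function": {
            "name": "get_customer_id",
            "description": "Retrieves the user's customer id",
            # no parameters: the model just calls it whenever it needs the id
            "parameters": {"type": "object", "properties": {}},
        },
    }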

This tool basically sends a pre-defined response such as {"role": "assistant", "content": "Please provide your customer id"}.

Now my question is:

There is a FastAPI endpoint defined as a function called def create_chat() that handles the chat (think @app.post("/chat")). This high-level function calls another function named handle_user_chat, which implements the chatbot logic. Inside handle_user_chat, there is another function named handle_tool_calls that basically sees which tool was called and executes it. So how should I implement the get_customer_id Python function so that control returns to the create_chat function and the POST request gets back the {"role": "assistant", "content": "Please provide your customer id"} response?
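Roughly, the call chain looks like this (heavily simplified; the request shape and the call_llm helper are placeholders, not my actual code):

    from fastapi import FastAPI

    app = FastAPI()

    @app.post("/chat")
    def create_chat(payload: dict):
        # payload is assumed to carry the conversation so far
        messages = payload["messages"]
        return handle_user_chat(messages)

    def handle_user_chat(messages):
        # call_llm stands in for the real LLM request
        response = call_llm(messages)
        if response.get("tool_calls"):
            return handle_tool_calls(response["tool_calls"], messages)
        return response

    def handle_tool_calls(tool_calls, messages):
        # look at which tool the model asked for and execute it
        for call in tool_calls:
            if call["name"] == "get_customer_id":
                return get_customer_id()
            # ... the other database tools are dispatched here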

The way I implemented it is by raising a certain, predefined exception in get_customer_id, as follows:

    class SomePredefinedException(Exception):
        pass

    def get_customer_id():
        raise SomePredefinedException()

And then checking for this exception in the endpoint:

    @app.post("/chat")
    def create_chat():
        # Some code here
        try:
            # some code here
            ...
        except SomePredefinedException as ee:
            return {"role": "assistant", "content": "Please provide your customer id"}
        except Exception as e:
            return {"message": "Some error was encountered"}

(This is not the actual code, but I hope you get the point.)

Am I doing this correctly or is there a better way?

submitted by /u/Amgadoz
