Analysis and Advancement in Domain-Specific Templated Question Answering

Authors

Aaditya Baranwal, Jyotin Goel, Prashant Tandon, Renu Sankhla and Sukriti Goyal, Indian Institute of Technology Jodhpur, India

Abstract

This work addresses the challenge of domain-specific question answering through the intelligent composition of tool sequences using a large language model. We formulate the problem as answering a query Q with a set of tools T by determining which tools are needed, their arguments, and the order in which they are executed. Our approach enhances language model capabilities through prompt engineering, advanced reasoning, and a custom Chain of Thoughts (CoT) inspired strategy for dynamic, cascaded user engagement. Multi-task learning broadens the scope of knowledge, while transfer learning from domains with richer tooling improves versatility. Runtime compute costs are reduced through distillation. Our evaluation shows that the method excels at selecting optimal tool combinations for domain-specific queries, outperforming baseline approaches in accuracy and coverage. The result is a reusable framework for constructing proficient and cost-effective domain-specific Question Answering (QA) solutions. Key explorations include prompt engineering for tool interfaces, compositional learning across tools, transfer learning from richer domains, and prompt distillation, all of which facilitate the practical deployment of LLMs in industrial applications.
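To make the formulation concrete, the abstract's setup (use a set of tools T to answer a query Q by choosing tools, arguments, and an execution order) can be sketched as a toy planner. This is an illustrative assumption only: the `Tool` class, the keyword-overlap scoring, and `plan_tools` are stand-ins for the paper's LLM-driven selection, not its actual method.

```python
from dataclasses import dataclass, field

@dataclass
class Tool:
    """A tool from the set T: a name, a description, and trigger keywords."""
    name: str
    description: str
    keywords: set = field(default_factory=set)

def plan_tools(query, tools):
    """Toy stand-in for LLM tool selection: score each tool in T by
    keyword overlap with the query Q, and return an ordered plan of
    (tool_name, matched_terms) pairs, most relevant tool first."""
    words = set(query.lower().split())
    plan = []
    for tool in tools:
        hits = words & tool.keywords
        if hits:  # the tool is deemed necessary for this query
            plan.append((tool.name, sorted(hits)))
    # Execute tools with more matched terms earlier in the sequence.
    plan.sort(key=lambda p: len(p[1]), reverse=True)
    return plan

# Hypothetical tool set T and query Q for illustration.
tools = [
    Tool("search_docs", "retrieve relevant passages", {"find", "search", "docs"}),
    Tool("calculator", "perform arithmetic", {"sum", "average", "compute"}),
]
print(plan_tools("search the docs and compute the average", tools))
```

In the paper's actual pipeline, the scoring and sequencing step above is performed by a prompted LLM rather than keyword matching; the sketch only fixes the input/output shape of the problem.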

Keywords

Query, Tool, Tool Retrieval, Chain of Thoughts (CoT) Prompting, Prompt Engineering, QA, Distillation Step by Step, Array of Thoughts (AoT), GPT, LLM, Rationale.

Volume 14, Number 8