Poster: Automated Dependency Mapping for Web API Security Testing Using Large Language Models
Li W., Guo Y.
CCS 2024 - Proceedings of the 2024 ACM SIGSAC Conference on Computer and Communications Security, pp. 5024-5026, 2024
Dependency extraction is crucial in web API security testing, as it helps identify the API sequences required to exploit a vulnerability. Traditional methods are generally rule-based and require extensive manual analysis of API specification documents by domain experts to formulate appropriate rules. This manual process is not only time-consuming and labor-intensive but also prone to missed dependencies and inaccuracies, which can compromise the effectiveness of security testing. In this paper, we explore the potential of large language models (LLMs) to automate dependency mapping in web APIs. By leveraging the natural-language understanding and generation capabilities of advanced LLMs such as GPT-3.5, Mistral-7B-Instruct, and Llama-3-8B-Instruct, we aim to streamline the dependency mapping process, reducing the need for manual analysis and enhancing accuracy. Our preliminary experiments demonstrate that this approach can effectively build dependency mappings, offering a promising alternative to traditional rule-based approaches.
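The workflow the abstract describes can be sketched as follows. This is an illustrative outline, not the authors' implementation: the OpenAPI fragment, the prompt wording, the JSON response schema, and the stubbed model reply standing in for an actual GPT-3.5 / Mistral-7B-Instruct / Llama-3-8B-Instruct call are all hypothetical, chosen only to show the producer-consumer dependency idea (one operation's response field feeding another operation's parameter).

```python
import json

# Hypothetical OpenAPI fragment: POST /users returns an "id" that
# GET /users/{id} consumes -- a classic producer-consumer dependency.
SPEC = {
    "paths": {
        "/users": {
            "post": {
                "responses": {
                    "201": {
                        "content": {
                            "application/json": {
                                "schema": {
                                    "properties": {"id": {"type": "integer"}}
                                }
                            }
                        }
                    }
                }
            }
        },
        "/users/{id}": {
            "get": {
                "parameters": [{"name": "id", "in": "path", "required": True}]
            }
        },
    }
}


def build_prompt(spec):
    """Assemble an instruction prompt asking an LLM to emit API
    dependencies as JSON (producer operation -> consumer operation)."""
    return (
        "Given this OpenAPI specification, list every dependency where one "
        "operation's response field must be supplied as another operation's "
        "parameter. Answer as a JSON list of "
        '{"producer": ..., "consumer": ..., "field": ...} objects.\n\n'
        + json.dumps(spec)
    )


def parse_dependencies(llm_output):
    """Parse the model's JSON answer into (producer, consumer, field) tuples."""
    return [
        (d["producer"], d["consumer"], d["field"])
        for d in json.loads(llm_output)
    ]


# Stubbed reply standing in for a real model call.
FAKE_REPLY = (
    '[{"producer": "POST /users", '
    '"consumer": "GET /users/{id}", "field": "id"}]'
)

deps = parse_dependencies(FAKE_REPLY)
print(deps)  # [('POST /users', 'GET /users/{id}', 'id')]
```

In a real pipeline, the stubbed reply would be replaced by an actual model completion, and the extracted dependency pairs would drive the ordering of requests during security testing (e.g., create a user before probing the lookup endpoint).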