
A Case Study on the Generative AI Project Life Cycle Using Large Language Models

11 pages | Published: March 21, 2024

Abstract

Large Language Models represent a disruptive technology set to revolutionize the future of artificial intelligence. While numerous literature reviews and survey articles discuss their benefits and address security and compliance concerns, there remains a shortage of research exploring the implementation life cycle of generative AI systems. This paper addresses this gap by presenting the various phases of the generative AI life cycle and detailing the development of a chatbot designed to address inquiries from prospective students. Utilizing the Google Flan LLM and a question-answering pipeline, we processed user prompts. In addition, we compiled an input file containing domain knowledge of the education program, which was preprocessed and condensed into vector embeddings using the HuggingFace library. Furthermore, we designed a chat interface for user interaction using Streamlit. The responses generated by the chatbot are both descriptive and contextually pertinent to the prompts, with their quality improving in response to more detailed prompts. However, a significant constraint is the size limit of the input file, given the processing power limitations of CPUs.
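
The pipeline outlined in the abstract (domain-knowledge file, HuggingFace vector embeddings, a Flan question-answering step, and a Streamlit chat interface) can be sketched roughly as below. The model names, file name, chunking strategy, and the answer helper are illustrative assumptions, not the authors' actual implementation.

# A minimal sketch, assuming a plain-text knowledge file and CPU-only models.
from transformers import pipeline
from sentence_transformers import SentenceTransformer, util

# 1. Load the domain-knowledge file for the education program and split it
#    into short passages so each fits the model's context window.
#    (File name and blank-line chunking are assumptions.)
with open("program_knowledge.txt", encoding="utf-8") as f:
    passages = [p.strip() for p in f.read().split("\n\n") if p.strip()]

# 2. Condense the passages into vector embeddings with a HuggingFace model
#    (the specific sentence-transformer is an assumption).
embedder = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
passage_embeddings = embedder.encode(passages, convert_to_tensor=True)

# 3. Question answering with a Flan model; a small checkpoint keeps it CPU-friendly.
qa_model = pipeline("text2text-generation", model="google/flan-t5-base")

def answer(prompt: str, top_k: int = 3) -> str:
    # Retrieve the passages most similar to the user prompt ...
    query_embedding = embedder.encode(prompt, convert_to_tensor=True)
    hits = util.semantic_search(query_embedding, passage_embeddings, top_k=top_k)[0]
    context = "\n".join(passages[hit["corpus_id"]] for hit in hits)
    # ... and let the LLM answer from that retrieved context.
    result = qa_model(
        f"Answer the question using the context.\nContext: {context}\nQuestion: {prompt}"
    )
    return result[0]["generated_text"]

The chat interface described in the abstract could then wrap this helper in a Streamlit page; the widget layout here is likewise only a guess at the authors' design.

# A possible Streamlit front end (hypothetical layout).
import streamlit as st

st.title("Prospective Student Chatbot")
prompt = st.text_input("Ask a question about the program:")
if prompt:
    st.write(answer(prompt))

Because every passage of the input file is embedded up front, the size of that file directly drives memory and CPU time, which is consistent with the constraint the abstract reports.
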

Keyphrases: chatbot, chatgpt, embeddings, gpt, large language models (llms), natural language processing (nlp), transformer, vector database

In: Ajay Bandi, Mohammad Hossain and Ying Jin (editors). Proceedings of 39th International Conference on Computers and Their Applications, vol 98, pages 189-199.

BibTeX entry
@inproceedings{CATA2024:Case_Study_Generative_AI,
  author    = {Ajay Bandi and Hemanth Kagitha},
  title     = {A Case Study on the Generative AI Project Life Cycle Using Large Language Models},
  booktitle = {Proceedings of 39th International Conference on Computers and Their Applications},
  editor    = {Ajay Bandi and Mohammad Hossain and Ying Jin},
  series    = {EPiC Series in Computing},
  volume    = {98},
  publisher = {EasyChair},
  bibsource = {EasyChair, https://easychair.org},
  issn      = {2398-7340},
  url       = {/publications/paper/GJZq},
  doi       = {10.29007/hvzc},
  pages     = {189-199},
  year      = {2024}}