News Writer

Date: Oct 2023

Fine-tuned two LLMs, Microsoft Phi 1.5 (a small-scale 1.3B model) and Google Flan-T5 XXL (a large-scale 11B model), on public datasets from Hugging Face. Analyzed the two models' outputs and compared their architectures. Built a Gradio web application to host the models and compare their outputs side by side (a minimal sketch of such an app follows).
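The sketch below is illustrative rather than the project's actual code: it shows how two fine-tuned checkpoints could be queried with the same prompt and displayed next to each other in a Gradio interface. The checkpoint paths and generation settings are placeholder assumptions.

```python
# Minimal sketch of a Gradio app comparing two fine-tuned checkpoints.
# The local paths below are hypothetical placeholders.
import gradio as gr
from transformers import pipeline

# Phi 1.5 is a causal LM; Flan-T5 XXL is a sequence-to-sequence model.
phi_generator = pipeline("text-generation", model="./phi-1_5-news-finetuned")
flan_generator = pipeline("text2text-generation", model="./flan-t5-xxl-news-finetuned")

def compare(prompt: str):
    # Generate a news-style continuation from each model for the same prompt.
    phi_out = phi_generator(prompt, max_new_tokens=200)[0]["generated_text"]
    flan_out = flan_generator(prompt, max_new_tokens=200)[0]["generated_text"]
    return phi_out, flan_out

demo = gr.Interface(
    fn=compare,
    inputs=gr.Textbox(label="Prompt"),
    outputs=[gr.Textbox(label="Phi 1.5"), gr.Textbox(label="Flan-T5 XXL")],
    title="News Writer: model comparison",
)

if __name__ == "__main__":
    demo.launch()
```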

Skills / Tools:

Transformer, NLP, Gradio, LangChain