Acknowledgement
This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation) (IITP-No.2024-2021-0-01817, No.RS-2020-II201373, No.2022-0-00498).
References
- Brown, Tom B., et al., "Language models are few-shot learners.", arXiv preprint arXiv:2005.14165, 2020.
- Devlin, Jacob, et al., "BERT: Pre-training of deep bidirectional transformers for language understanding.", arXiv preprint arXiv:1810.04805, 2018.
- Dubey, Abhimanyu, et al., "The Llama 3 herd of models.", arXiv preprint arXiv:2407.21783, 2024.
- Lewis, Patrick, et al., "Retrieval-augmented generation for knowledge-intensive NLP tasks.", Advances in Neural Information Processing Systems 33, (2020): 9459-9474.
- Douze, Matthijs, et al., "The Faiss library.", arXiv preprint arXiv:2401.08281, 2024.
- Chroma, the open-source AI application database, https://www.trychroma.com/
- Almeida, Felipe, and Geraldo Xexeo, "Word embeddings: A survey.", arXiv preprint arXiv:1901.09069, 2019.
- Chroma Technical Report, https://research.trychroma.com/evaluating-chunking
- Faiss indexes, https://github.com/facebookresearch/faiss/wiki/Faiss-indexes