• Title/Summary/Keyword: Samsung SDS (삼성SDS)

Search results: 22

Study on Corporate Facebook Posts and User Engagement of the KOSPI 100 Companies in Korea: Difference between B2B and B2C Companies (국내 100대 기업 페이스북 콘텐츠 전략과 인게이지먼트 연구: B2B·B2C 기업 간 차이를 중심으로)

  • Jo, Joohong; Ko, Chaeeun; Baek, Hyunmi
    • Knowledge Management Research, v.23 no.3, pp.65-88, 2022
  • Companies actively engage with the public through social media to enhance sales and promote brand awareness, a trend further accelerated by the pandemic. However, previous studies have tended to treat companies as a homogeneous group. This study focuses on the differences between B2B and B2C companies' social media content strategies in relation to user engagement. It categorized the KOSPI 100 companies that manage corporate Facebook fan pages into B2B and B2C, and then analyzed the content they posted from January 1 to December 31, 2020. The results showed that, compared to B2B companies, B2C companies tended to use videos over images, preferred hashtags, and mentioned their product names more often, whereas B2B companies preferred images, used more hyperlinks, and mentioned their company names more often. For B2B companies, images and text length had positive effects on user engagement, while hyperlinks and URLs had negative effects; for B2C companies, text length had a positive effect. This study offers practical implications for PR practitioners establishing social media strategies that enhance user engagement.
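The comparison the abstract describes, grouping posts by company type and content format and contrasting mean engagement, can be sketched as follows. This is an illustrative toy, not the paper's method: the records, field names, and numbers are invented, and the actual study used a full year of KOSPI 100 Facebook posts with regression analysis.

```python
# Hedged sketch: compare mean engagement across B2B/B2C post groups.
# All data and field names below are hypothetical, for illustration only.
from statistics import mean

posts = [
    {"type": "B2B", "format": "image", "engagement": 120},
    {"type": "B2B", "format": "video", "engagement": 80},
    {"type": "B2C", "format": "video", "engagement": 300},
    {"type": "B2C", "format": "image", "engagement": 150},
    {"type": "B2C", "format": "video", "engagement": 260},
]

def mean_engagement(records, **filters):
    """Mean engagement over posts matching every given field=value filter."""
    hits = [p["engagement"] for p in records
            if all(p[k] == v for k, v in filters.items())]
    return mean(hits) if hits else 0.0

b2c_video = mean_engagement(posts, type="B2C", format="video")  # 280.0
b2b_image = mean_engagement(posts, type="B2B", format="image")  # 120.0
```

In the study itself, such group means would be a first descriptive step before regressing engagement on content features (format, text length, hashtags, hyperlinks) separately for B2B and B2C firms.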

Generative AI service implementation using LLM application architecture: based on RAG model and LangChain framework (LLM 애플리케이션 아키텍처를 활용한 생성형 AI 서비스 구현: RAG모델과 LangChain 프레임워크 기반)

  • Cheonsu Jeong
    • Journal of Intelligence and Information Systems, v.29 no.4, pp.129-164, 2023
  • As the use of Large Language Models (LLMs) expands with recent advances in generative AI technology, existing studies offer few concrete application cases or implementation methods for using internal company data with LLMs. Accordingly, this study presents a method of implementing generative AI services with an LLM application architecture based on the widely used LangChain framework. It reviews ways to overcome the problem of insufficient information in LLMs, comparing fine-tuning with the direct use of document information, and presents concrete solutions. In particular, it details the main steps of storing and retrieving information with the retrieval-augmented generation (RAG) model, in which similar-context recommendation and question-answering (QA) systems store and search information in a vector store. The specific operation method and major implementation steps and cases, including implementation source code and user interface, are presented to enhance understanding of generative AI technology. This work has value in enabling companies to actively utilize LLMs when implementing internal services.
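The RAG storage-and-retrieval steps the abstract outlines, embedding documents into a vector store, retrieving the most similar ones for a query, and grounding the LLM prompt with them, can be sketched in plain Python. This is a minimal stand-in, not the paper's implementation: a real system would use an embedding model and a vector database via LangChain, whereas here toy bag-of-words vectors and cosine similarity illustrate the mechanics; all document texts and names are hypothetical.

```python
# Minimal RAG retrieval sketch: embed, store, retrieve, then build a
# grounded prompt. Bag-of-words counts stand in for real embeddings.
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Toy embedding: lower-cased bag-of-words term counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorStore:
    """Stores (vector, text) pairs; retrieves the k most similar texts."""
    def __init__(self):
        self.docs = []

    def add(self, text: str):
        self.docs.append((embed(text), text))

    def retrieve(self, query: str, k: int = 2):
        qv = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(qv, d[0]), reverse=True)
        return [text for _, text in ranked[:k]]

# Index hypothetical internal company documents.
store = VectorStore()
store.add("Expense reports must be filed within 30 days.")
store.add("The VPN requires two-factor authentication.")
store.add("Annual leave carries over up to 5 days.")

# Retrieval-augmented generation: the retrieved context grounds the prompt
# that would be sent to the LLM.
question = "How do I file an expense report?"
context = store.retrieve(question, k=1)
prompt = f"Answer using this context:\n{context[0]}\nQuestion: {question}"
```

In a LangChain-based service, the same pipeline maps onto an embeddings model, a vector store, and a retrieval QA chain; the sketch only shows why retrieval lets the LLM answer from internal documents it was never trained on.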