• Title/Summary/Keyword: Technology Sharing

A Clinical Study on the Relationship between Dental Implant and Systemic Disease (치과 임플란트와 전신질환과의 연관성에 관한 임상적 고찰)

  • Yang-Keum Han;Mi-Sook Yoon;Han-Hong Kim
    • Journal of Korean Dental Hygiene Science
    • /
    • v.6 no.2
    • /
    • pp.25-35
    • /
    • 2023
  • Background: To emphasize the importance of dental hygiene management processes led by clinical dental hygienists for patients with systemic diseases, this study sought to identify, from clinical data, the systemic diseases that affect dental implants. Methods: A literature review was conducted from March 1 to May 31, 2023, covering research papers published in domestic and foreign academic journals from January 2000 to December 2020. The domestic databases searched were RISS, Nuri Media (DBpia), and Korea Academic Information (http://www.papersearch.net ; KISS); PubMed was searched with the terms "dental implant failure", "implant", "systemic disease", and "dental implant and systemic disease". Results: The cumulative survival rate of implants averaged 94.3% and the failure rate 5.7%. In the clinical analysis of implant-related systemic diseases, the most frequent condition was reported in 13 studies (100.0%), followed by hypertension and smoking (8; 61.5%), cardiovascular disease (7; 53.8%), and a further condition (5; 38.5%); liver disease, thyroid abnormalities, blood abnormalities, organ transplantation, and infectious diseases were also confirmed. Conclusion: Since uncontrolled systemic diseases are a risk factor for implant failure, clinical dental hygienists should help patients maintain healthy oral conditions by sharing information with them during periodic preventive dental hygiene care processes such as dental hygiene assessment.

A Study to Improve the Trustworthiness of Data Repositories by Obtaining CoreTrustSeal Certification (CoreTrustSeal 인증 획득을 통한 데이터 리포지토리의 신뢰성 향상을 위한 연구)

  • Hea Lim Rhee;Jung-Ho Um;Youngho Shin;Hyung-jun Yim;Na-eun Han
    • Journal of the Korean Society for Information Management
    • /
    • v.41 no.2
    • /
    • pp.245-268
    • /
    • 2024
  • As the recognition of data's value increases, the role of data repositories in managing, preserving, and utilizing data is becoming increasingly important. This study investigates ways to enhance the trustworthiness of data repositories through obtaining CoreTrustSeal (CTS) certification. Trust in data repositories is critical not only for data protection but also for building and maintaining trust between the repository and stakeholders, which in turn affects researchers' decisions on depositing and utilizing data. The study examines the CoreTrustSeal, an international certification for trustworthy data repositories, analyzing its impact on the trustworthiness and efficiency of repositories. Using the example of DataON, Korea's first CTS-certified repository operated by the Korea Institute of Science and Technology Information (KISTI), the study compares and analyzes four repositories that have obtained CTS certification. These include DataON, the Physical Oceanography Distributed Active Archive Center (PO.DAAC) from NASA, Yareta from the University of Geneva, and the DARIAH-DE repository from Germany. The research assesses how these repositories meet the mandatory requirements set by CTS and proposes strategies for improving the trustworthiness of data repositories. Key findings indicate that obtaining CTS certification involves rigorous evaluation of organizational infrastructure, digital object management, and technological aspects. The study highlights the importance of transparent data processes, robust data quality assurance, enhanced accessibility and usability, sustainability, security measures, and compliance with legal and ethical standards. By implementing these strategies, data repositories can enhance their reliability and efficiency, ultimately promoting wider data sharing and utilization in the scientific community.
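The assessment process described above lends itself to a machine-readable self-assessment checklist. The sketch below is illustrative only: the three requirement categories are the ones named in the abstract, while the individual requirement labels and the 0-4 scoring scale are hypothetical placeholders, not official CoreTrustSeal wording.

```python
# Illustrative CTS self-assessment tracker. Category names follow the
# abstract; requirement labels and the 0-4 compliance scale are
# hypothetical placeholders, not official CoreTrustSeal text.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    label: str          # placeholder description
    score: int = 0      # 0 = not addressed ... 4 = fully implemented

@dataclass
class Category:
    name: str
    requirements: list = field(default_factory=list)

    def mean_score(self) -> float:
        return sum(r.score for r in self.requirements) / len(self.requirements)

checklist = [
    Category("Organizational Infrastructure",
             [Requirement("Mission and scope"), Requirement("Licensing and legal compliance")]),
    Category("Digital Object Management",
             [Requirement("Data integrity and authenticity"), Requirement("Preservation plan")]),
    Category("Technology",
             [Requirement("Technical infrastructure"), Requirement("Security measures")]),
]

# Summarize readiness per category before submitting an application.
for cat in checklist:
    print(f"{cat.name}: mean score {cat.mean_score():.1f}")
```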

The Effects of Near Miss and Accident Prevention Activities and the Culture of Patient Safety Management for the Patient Safety (Near Miss 사고 예방 활동과 환자안전관리 문화형성이 환자안전에 미치는 영향)

  • Chang, Ho-Suk;Lee, Gui-Won
    • The Korean Journal of Nuclear Medicine Technology
    • /
    • v.14 no.2
    • /
    • pp.138-144
    • /
    • 2010
  • Purpose: Despite the rapidly changing healthcare environment, healthcare organizations have recognized the importance of patient safety management. Patient safety management, however, suffers from low member participation because it has focused on follow-up measures and punishment. The department of nuclear medicine at Uijeongbu St. Mary's Hospital began this research to reduce near misses and prevent patient safety accidents through participatory near-miss prevention activities as advance management, and by building a reporting system that carries no disadvantage for the reporter. In addition, this research aims to establish a differentiated patient safety management system in the department of nuclear medicine. Materials and Methods: 1. Collecting team members' past and present near misses and accidents (first data collection). 2. Quantifying the cases after rating their importance and urgency through surveys (second data collection). 3. Identifying the important points of contact through data analysis. 4. Writing and standardizing a manual for those points of contact, and initiating participatory error-prevention activities. 5. Activating a web-based community to establish the near-miss reporting system. 6. Evaluating the results before and after the activities through surveys and focus group interviews. Results: 1) Safety accidents and near misses in the department were quantified: about 50 near misses a month and one safety accident a year. 2) Improvement measures were established from the quantified data: about 11 participatory activities, process improvements, and a standardization manual. 3) A safety culture system with a high participation rate was created: a reporting system, a checklist and slogan for safety culture, and assessment indices. 4) Communities for sharing information on near misses and accidents were activated. 5) As a result of these activities, near-miss occurrence declined by 50% and no safety accident occurred. Conclusion: The best service a department of nuclear medicine can provide is safety-guaranteed, high-quality examination and treatment. This research started from the question, "What is the most faithful-to-the-basics way to provide the best service for patients?", and the team members' common answer was to build a system with the participation of all members. Building such a system through participatory near-miss prevention activities and the creation of a safety culture cut near-miss occurrence by 50% with no accidents, a meaningful result from the perspective of advance management for patient safety. Moreover, by establishing a reporting system with no disadvantage for reporting, this research paved the way for a culture in which near misses and accidents are reported and acknowledged. A system that sticks to the basics is the best service for patients and will form a patient safety culture, which in turn leads to customer satisfaction. The department will therefore develop a differentiated patient safety culture as the established system stabilizes.
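The survey-based quantification step (rating each case's importance and urgency, then prioritizing) can be illustrated with a short sketch. The record fields, rating scale, and multiplicative priority rule below are hypothetical reconstructions of what the abstract describes, not the department's actual system.

```python
# Hypothetical near-miss report record and prioritization, sketching the
# survey-based quantification the abstract describes (importance x urgency).
from dataclasses import dataclass

@dataclass
class NearMissReport:
    description: str
    importance: int   # survey rating, assumed 1-5 scale
    urgency: int      # survey rating, assumed 1-5 scale

    @property
    def priority(self) -> int:
        # Simple multiplicative rule; the paper's actual weighting is not given.
        return self.importance * self.urgency

reports = [
    NearMissReport("Mislabeled radiopharmaceutical dose", importance=5, urgency=5),
    NearMissReport("Patient ID not double-checked", importance=4, urgency=3),
]

# Highest-priority items become candidates for a standardized manual entry.
for r in sorted(reports, key=lambda r: r.priority, reverse=True):
    print(f"priority={r.priority:2d}  {r.description}")
```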

A Research in Applying Big Data and Artificial Intelligence on Defense Metadata using Multi Repository Meta-Data Management (MRMM) (국방 빅데이터/인공지능 활성화를 위한 다중메타데이터 저장소 관리시스템(MRMM) 기술 연구)

  • Shin, Philip Wootaek;Lee, Jinhee;Kim, Jeongwoo;Shin, Dongsun;Lee, Youngsang;Hwang, Seung Ho
    • Journal of Internet Computing and Services
    • /
    • v.21 no.1
    • /
    • pp.169-178
    • /
    • 2020
  • Reductions in troops and human resources, together with the drive to improve combat power, have led the Korean Department of Defense to actively adopt Fourth Industrial Revolution technologies (artificial intelligence, big data). The defense information systems have been developed in various ways according to the tasks and particular characteristics of each service. To take full advantage of Fourth Industrial Revolution technologies, the closed defense data management system must be improved. However, establishing and enforcing data standards across all information systems, as needed to exploit defense big data and artificial intelligence, is limited by security issues, the business characteristics of each service, and the difficulty of standardizing large-scale systems. Today, data sharing is restricted to direct linkages established through interoperability agreements between systems, based on each system's interworking requirements. To implement smart defense with Fourth Industrial Revolution technologies, it is urgent to prepare a system for sharing and exploiting defense data. To support this technically, it is critical to develop a Multi Repository Meta-Data Management (MRMM) system that supports systematic standard management of defense data, managing the enterprise standard and the standard mappings for each system, and promoting data interoperability through linkage between standards in accordance with the Defense Interoperability Management Development Guidelines. We introduce MRMM and implement it using vocabulary similarity computed with machine learning and statistical approaches. Based on MRMM, we expect the standardization and integration of all military databases for artificial intelligence and big data to be greatly simplified, leading to a large reduction in the defense budget while increasing combat power for smart defense.
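The abstract states only that MRMM maps vocabularies between each system's standard and the enterprise standard via vocabulary similarity; the actual model is not given. The sketch below shows one plausible statistical baseline using string similarity from the Python standard library; the term lists are invented examples, not defense vocabulary.

```python
# A plausible baseline for standard-to-standard vocabulary mapping:
# rank enterprise-standard terms by string similarity to each system term.
# The actual MRMM similarity model (ML + statistical) is not specified in
# the abstract; term lists here are invented for illustration.
from difflib import SequenceMatcher

enterprise_standard = ["unit_identifier", "equipment_serial_no", "maintenance_date"]
system_vocabulary   = ["unitID", "equip_serial", "maint_dt"]

def similarity(a: str, b: str) -> float:
    # Normalized longest-common-subsequence-style ratio in [0, 1].
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for term in system_vocabulary:
    best = max(enterprise_standard, key=lambda std: similarity(term, std))
    print(f"{term:15s} -> {best:20s} (score {similarity(term, best):.2f})")
```

In practice a candidate mapping like this would be reviewed by a human standards manager before being registered, which is consistent with the "standard mapping for each system" role the abstract assigns to MRMM.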

Synthesis, Structure, and Magnetic Properties of One-Dimensional Thiophosphates, $A_2NiP_2S_6$ (A=Rb, Cs) (1차원 구조를 갖는 Thiophosphates, $A_2NiP_2S_6$ (A=Rb, Cs)의 합성, 구조 및 자기적 성질)

  • Dong, Yong Kwan;Lee, Kun Soo;Yun, Ho Seop;Hur, Nam Hwi
    • Journal of the Korean Chemical Society
    • /
    • v.45 no.3
    • /
    • pp.242-246
    • /
    • 2001
  • The quaternary thiophosphates $A_2NiP_2S_6$ (A=Rb, Cs) have been synthesized with halide fluxes and structurally characterized by the single-crystal X-ray diffraction technique. These compounds crystallize in the space group $C_{2h}^5$-$P2_1/n$ of the monoclinic system with two formula units in a cell of dimensions a=5.960(2), b=12.323(4), c=7.491(3) $\AA$, $\beta=97.05(3)^{\circ}$, and V=546.0(3) $\AA^3$ for $Rb_2NiP_2S_6$, and a=5.957(4), b=12.696(7), c=7.679(4) $\AA$, $\beta=93.60(5)^{\circ}$, and V=579.7(5) $\AA^3$ for $Cs_2NiP_2S_6$. The two compounds are isostructural. The structure of $Cs_2NiP_2S_6$ is made up of one-dimensional ${}_{\infty}^{1}[NiP_2S_6^{2-}]$ chains along the a axis, and these chains are isolated from one another by $Cs^+$ ions. The Ni atom is octahedrally coordinated by six S atoms; these $NiS_6$ octahedral units are linked by sharing three $\mu$-S atoms of the $[P_2S_6^{4-}]$ anions to form the infinite one-dimensional chain. For $Cs_2NiP_2S_6$, the magnetic susceptibility reveals an antiferromagnetic exchange interaction below 8 K, which corresponds to the Néel temperature ($T_N$). Above $T_N$ the compound obeys the Curie-Weiss law. The effective magnetic moment, Curie constant C, and Weiss constant $\theta$ for $Cs_2NiP_2S_6$ are 2.77 B.M., 0.9593 emu·K/mol, and -19.02 K, respectively. The effective magnetic moment obtained from the magnetic data agrees with the spin-only value of 2.83 B.M. for $Ni^{2+}$ ($d^8$).
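For reference, the reported numbers are mutually consistent under a standard Curie-Weiss analysis. The original listing printed the Curie constant's unit simply as "K"; assuming the conventional CGS unit emu·K/mol, the usual relations reproduce the stated moment:

```latex
% Curie-Weiss law above the Neel temperature T_N, and the effective moment
\chi(T) = \frac{C}{T - \theta}, \qquad
\mu_{\mathrm{eff}} = \frac{1}{\mu_B}\sqrt{\frac{3 k_B C}{N_A}} \approx 2.828\,\sqrt{C}\ \mathrm{B.M.}
% Check against the reported values:
\mu_{\mathrm{eff}} = 2.828\,\sqrt{0.9593} \approx 2.77\ \mathrm{B.M.},
% matching the abstract and close to the spin-only value
% \sqrt{n(n+2)} = \sqrt{8} = 2.83 B.M. for Ni^{2+} (d^8, two unpaired electrons).
```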

Comparative Analysis of ViSCa Platform-based Mobile Payment Service with other Cases (스마트카드 가상화(ViSCa) 플랫폼 기반 모바일 결제 서비스 제안 및 타 사례와의 비교분석)

  • Lee, June-Yeop;Lee, Kyoung-Jun
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.2
    • /
    • pp.163-178
    • /
    • 2014
  • This research proposes Virtualization of Smart Cards (ViSCa), a security system that aims to provide a multi-device platform for deploying services that require a strong security protocol, both for access and authentication and for the execution of applications, and focuses on analyzing a ViSCa platform-based mobile payment service by comparing it with similar cases. Today, the appearance of new ICT, the diffusion of new user devices (smartphones, tablet PCs, and so on), and the growing internet penetration rate are creating many world-shaking services, yet in most of these applications private information has to be shared, which means that security breaches and illegal access to that information are real threats that have to be solved. Mobile payment, one of these innovative services, faces the same threats because it sometimes requires user identification, an authentication procedure, and confidential data sharing; an extra layer of security is therefore needed in its communication and execution protocols. The ViSCa concept is a holistic approach and centralized management for a security system that seeks to provide a ubiquitous multi-device platform for mobile payment services that demand a powerful security protocol, both for access and authentication and for the execution of applications. In this sense, ViSCa offers full interoperability and full access from any user device without any loss of security, preventing possible attacks by third parties and guaranteeing the confidentiality of personal data, bank accounts, and private financial information. The ViSCa concept is split into two phases: the execution of the user authentication protocol on the user device, and the cloud architecture that executes the secure application. Secure service access is thus guaranteed at any time, anywhere, and through any device supporting the required security mechanisms. The security level is improved by using virtualization technology in the cloud: terminal virtualization is used to virtualize the smart card hardware, and the virtualized smart cards are managed as a whole through mobile cloud technology. This entire process is referred to as Smart Card as a Service (SCaaS). The ViSCa platform-based mobile payment service virtualizes the smart card used as the means of payment and loads it into the mobile cloud; authentication takes place through an application, after which the user logs on to the mobile cloud and chooses one of the virtualized smart cards as a payment method. To set the scope of the comparison with similar cases, we categorized the mobile payment services of prior research by feature and service type. Both groups store the credit card's data on the mobile device and settle the payment at the offline market; by the location where the electronic financial transaction data are stored, they divide into two main service types. The first, the "App Method", loads the data onto a server connected to the application. The second, the "Mobile Card Method", stores the data in the Integrated Circuit (IC) chip that holds financial transaction data, built into the mobile device's secure element (SE). From prior research on the acceptance factors of mobile payment services and their market environment, we derived six key factors for comparative analysis: economy, generality, security, convenience (ease of use), applicability, and efficiency. Within the chosen group, we compared and analyzed the selected cases and the ViSCa platform-based mobile payment service.

A Study on the Direction of Human Identity and Dignity Education in the AI Era. (AI시대, 인간의 정체성과 존엄성 교육의 방향)

  • Seo, Mikyoung
    • Journal of Christian Education in Korea
    • /
    • v.67
    • /
    • pp.157-194
    • /
    • 2021
  • The issue of AI's ethical consciousness has constantly been on the rise. Like a child, AI learns and imitates every behavior human beings perform. Therefore, the ethical consciousness we currently demand from AI is first the ethical consciousness required of humans, and at its center is human dignity. This study thus analyzed human identity and its problems arising from the development of AI technology, defended the theological premises and characteristics of human dignity, and sought the direction of human dignity education as follows. First, this study discussed the development of AI and its relation to human beings. The development of AI technology has led to the sharing of "reason or intelligence", long regarded as the exclusive property of mankind, with machines. This raises the question of what distinctive humanity would remain to set humans apart from AI machines. Second, this study discussed transhumanism and human identity. Transhumanism argues for combining AI machines with humans in order to improve inefficient human intelligence and human capabilities, but such a combination raises the issue of human identity. In the AI era, human identity rests on trusting the thoughts God had when he made us. Third, this study defended the theological premise and character of human dignity. Human dignity has become a key concept in constitutions and international human rights treaties around the world; nonetheless, the declarative conviction that humans are dignified is difficult to understand apart from its Christian theological premise. That premise is that humans are dignified beings granted life by the Heavenly Father, a dignity expressed in the longing for goodness and eternity, the pursuit of beauty, and happiness in relationship with others. Fourth, this study presented the direction of human dignity education. Such education must awaken people to what human identity is, how human beings were created, and how precious they are, and lead them to consciously ponder and accept these highest values. That is education in human identity, and its core is that all people, regardless of circumstances such as the wealth gap, knowledge level, skin color, gender, age, or disability, are made in God's image and for the glory of God, and are therefore precious to God.

Deriving adoption strategies of deep learning open source framework through case studies (딥러닝 오픈소스 프레임워크의 사례연구를 통한 도입 전략 도출)

  • Choi, Eunjoo;Lee, Junyeong;Han, Ingoo
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.4
    • /
    • pp.27-65
    • /
    • 2020
  • Many information and communication technology companies have released their internally developed AI technologies to the public, for example Google's TensorFlow, Facebook's PyTorch, and Microsoft's CNTK. By releasing deep learning open source software, a company can strengthen its relationship with the developer community and the artificial intelligence (AI) ecosystem, while users can experiment with, implement, and improve the software. The field of machine learning is accordingly growing rapidly, and developers use and reproduce various learning algorithms in each field. Although open source software has been analyzed from various angles, studies that help industry develop or use deep learning open source software are lacking. This study therefore attempts to derive an adoption strategy through case studies of deep learning open source frameworks. Based on the technology-organization-environment (TOE) framework and a literature review on open source software adoption, we employed a case study framework comprising technological factors (perceived relative advantage, perceived compatibility, perceived complexity, and perceived trialability), organizational factors (management support and knowledge and expertise), and environmental factors (availability of technology skills and services, and platform long-term viability). A case study of three companies' adoption experiences (two successes and one failure) revealed that seven of the eight TOE factors, along with several factors regarding the company, team, and resources, are significant for adopting a deep learning open source framework. Organizing the results, we identified five success factors: the knowledge and expertise of the developers in the team, the hardware (GPU) environment, a data enterprise cooperation system, a deep learning framework platform, and a deep learning framework tool service. For an organization to adopt a deep learning open source framework successfully, at the stage of using the framework it must, first, provide a hardware (GPU) environment for the AI R&D group that supports the knowledge and expertise of the developers in the team. Second, research developers' use of deep learning frameworks must be supported by collecting and managing data inside and outside the company through a data enterprise cooperation system. Third, deep learning research expertise must be supplemented through cooperation with researchers from academic institutions such as universities and research institutes. By satisfying these three procedures at the usage stage, companies will grow their pool of deep learning research developers, their ability to use the framework, and their GPU resources. At the proliferation stage, fourth, the company builds a deep learning framework platform that improves the research efficiency and effectiveness of the developers, for example by optimizing the hardware (GPU) environment automatically. Fifth, a deep learning framework tool service team complements the developers' expertise by sharing information from the external deep learning open source community with the in-house community and by activating developer retraining and seminars.
To implement the five identified success factors, a step-by-step enterprise procedure for adopting the deep learning framework is proposed: defining the project problem, confirming that the deep learning methodology is the right method, confirming that the deep learning framework is the right tool, using the framework in the enterprise, and spreading it across the enterprise. The first three steps are pre-considerations for adopting a deep learning open source framework; once they are clear, the last two steps can proceed. In the fourth step, the knowledge and expertise of the developers in the team matter in addition to the hardware (GPU) environment and the data enterprise cooperation system. In the final step, all five success factors are realized for a successful adoption. This study provides strategic implications for companies adopting or using a deep learning framework according to the needs of each industry and business.
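One pre-consideration step, confirming that the framework is the right tool, maps to the "perceived trialability" factor: a candidate framework should be testable with minimal code. As an illustration only (the paper prescribes no particular framework), a PyTorch trial run can be this short:

```python
# Minimal PyTorch trial run on synthetic data, illustrating how cheaply a
# candidate framework can be evaluated ("perceived trialability").
# Purely illustrative; the paper does not prescribe a specific framework.
import torch

x = torch.randn(64, 10)            # synthetic inputs
y = torch.randn(64, 1)             # synthetic targets
model = torch.nn.Linear(10, 1)     # smallest possible model
opt = torch.optim.SGD(model.parameters(), lr=0.01)

for step in range(100):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()

print(f"final loss: {loss.item():.4f}")   # should fall over the 100 steps
```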

Impact of Net-Based Customer Service on Firm Profits and Consumer Welfare (기업의 온라인 고객 서비스가 기업의 수익 및 고객의 후생에 미치는 영향에 관한 연구)

  • Kim, Eun-Jin;Lee, Byung-Tae
    • Asia pacific journal of information systems
    • /
    • v.17 no.2
    • /
    • pp.123-137
    • /
    • 2007
  • The advent of the Internet and related Web technologies has created an easily accessible link between a firm and its customers and has given firms opportunities to use information technology to support supplementary after-sale services associated with a product or service. It is widely recognized that supplementary services are as important a source of customer value and competitive advantage as the characteristics of the product itself. Many of these supplementary services are information-based and need not be co-located with the product, so more and more companies deliver them electronically. Net-based customer service, defined as an Internet-based computerized information system that delivers services to a customer, is therefore the core infrastructure for supplementary service provision, and its importance in delivering after-sale services has been well documented. The strategic advantages of well-implemented net-based customer service are enhanced customer loyalty and higher lock-in of customers, with a resulting reduction in competition and a consequent increase in profits. However, not all customers utilize such service. The digital divide captures the observation that not all customers have equal access to computers: socioeconomic factors such as race, gender, and education level are strongly related to Internet accessibility and the ability to use it, owing to differences in the ability to bear the cost of a computer and in self-efficacy with technology, among other reasons. Applied to e-commerce, this concept has been called the "e-commerce divide", and high Internet penetration is not eradicating either divide as one would hope. Besides, to receive personalized support a customer must often provide personal information to the firm, including not only name and address but also preference and perhaps valuation information, and many recent studies show that consumers may be unwilling to share information about themselves due to concerns about privacy online. Because of the e-commerce divide and these privacy and security concerns, only a limited number of customers adopt net-based customer service, and this limited adoption affects both firm profits and customer welfare. We use a game-theoretic model in which the net-based customer service system is a mechanism to enhance customer loyalty. We model a market entry scenario where one firm (the incumbent) uses the system to induce loyalty in its customer base, selling one product through traditional retailing channels at a price set for those channels. Another firm (the entrant) enters the market and, having observed the incumbent's price (and deduced the loyalty levels in the customer base), chooses its own price. The profits of the firms and the surplus of the two customer segments (the segment that utilizes net-based customer service and the segment that does not) are analyzed in a Stackelberg leader-follower model of competition between the firms. We find that an increase in the customer base's adoption of net-based customer service is not always desirable for firms: when the service has low effectiveness in enhancing loyalty, firms prefer a high level of customer adoption, because a higher adoption rate decreases competition and increases profits, whereas a firm in an industry where net-based customer service is a highly effective loyalty mechanism prefers a low level of adoption by customers.
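The abstract does not give the model's functional forms, so the sketch below illustrates the leader-follower logic with an invented minimal setup: unit-demand consumers with a common valuation v, a fraction alpha loyal to the incumbent up to a price premium l, prices chosen from a grid, and the entrant best-responding to the incumbent's posted price.

```python
# Stackelberg price competition with a loyalty segment (illustrative only;
# the paper's actual functional forms are not given in the abstract).
# The incumbent (leader) posts p1; the entrant (follower) best-responds.
v, alpha, loyalty = 10.0, 0.4, 2.0              # valuation, loyal share, premium
grid = [round(i * 0.1, 1) for i in range(1, 101)]  # candidate prices 0.1..10.0

def profits(p1: float, p2: float) -> tuple:
    # Loyal consumers stay with the incumbent if p1 is within their premium.
    loyal_buys = p1 <= v and p1 <= p2 + loyalty
    # Switchers buy the cheaper affordable product (tie -> split the segment).
    if min(p1, p2) > v:
        s1 = s2 = 0.0
    elif p1 < p2:
        s1, s2 = 1.0, 0.0
    elif p2 < p1:
        s1, s2 = 0.0, 1.0
    else:
        s1 = s2 = 0.5
    pi1 = p1 * (alpha * loyal_buys + (1 - alpha) * s1)   # bool acts as 0/1
    pi2 = p2 * (1 - alpha) * s2
    return pi1, pi2

def entrant_best_response(p1: float) -> float:
    return max(grid, key=lambda p2: profits(p1, p2)[1])

# Backward induction: the leader anticipates the follower's best response.
p1_star = max(grid, key=lambda p1: profits(p1, entrant_best_response(p1))[0])
p2_star = entrant_best_response(p1_star)
print(f"incumbent p1={p1_star}, entrant p2={p2_star}, "
      f"profits={profits(p1_star, p2_star)}")
```

In this toy setup the entrant undercuts to capture the switchers while the incumbent prices to its loyal segment, which is the competition-softening effect of loyalty that the abstract's comparative statics turn on.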

Case Study on the Enterprise Microblog Usage: Focusing on Knowledge Management Strategy (기업용 마이크로블로그의 사용행태에 대한 사례연구: 지식경영전략을 중심으로)

  • Kang, Min Su;Park, Arum;Lee, Kyoung-Jun
    • Journal of Intelligence and Information Systems
    • /
    • v.21 no.1
    • /
    • pp.47-63
    • /
    • 2015
  • As knowledge attracts attention as a new production factor that generates added value, studies continue to apply knowledge management to the business environment. As ICT (Information and Communication Technology) has been grafted onto that environment, the task efficiency and productivity of individual workers have increased, and the way a business achieves its goals has shifted to one in which individual members willingly take part in the organization and share information to create new value (Han, 2003); studies on systems and services to support this transition are being carried out. Recently a new concept, "Enterprise 2.0", has appeared: the extension of Web 2.0 and its technologies, which emphasize participation, sharing, and openness, to the work environment of a business (Jung, 2013). Enterprise 2.0 is used as a collaborative tool to support individual creativity and group brain power by combining Web 2.0 technologies such as blogs, wikis, RSS, and tags with business software (McAfee, 2006). As Twitter became popular, the Enterprise Microblog (EMB), an Enterprise 2.0 application equivalent to Twitter for the business world, was developed, and SaaS (Software as a Service) offerings such as Yammer were introduced. Studies of EMB mainly demonstrate its usability for intra-firm communication and knowledge management, but existing research leans heavily toward large companies and particular departments rather than a company as a whole; few studies have examined small and medium-sized enterprises (SMEs), which have difficulty allocating separate resources and an exclusive workforce to introduce knowledge management. This study therefore focuses on small companies actually equipped with an EMB to learn how they use it, and examines their knowledge management strategies for the EMB from the perspectives of codification and personalization, the two classes of knowledge management strategy (Hansen et al., 1999). From a review of the precedent literature on knowledge management strategy and EMB, the hypothesis was established that as a company grows, its main application of the EMB shifts from a codification strategy to a personalization strategy.
To test the hypothesis, an SME that has used the EMB "Yammer" since its foundation was analyzed. The case study employed a longitudinal analysis that divides the period of EMB use into stages and examines the contents of the posts. Based on the key findings, the study proposes practical implications for the knowledge management of small companies and for the suitable application of a knowledge management system to their operations.
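The codification-versus-personalization coding of posts can be illustrated with a toy classifier. The keyword lists and sample posts below are invented; the paper's actual coding scheme is not described in the abstract.

```python
# Toy longitudinal coding of EMB posts as codification (documents, links,
# manuals) vs. personalization (questions, person-to-person exchange).
# Keyword lists and sample posts are invented for illustration only.
CODIFICATION = {"manual", "document", "attachment", "link", "wiki"}
PERSONALIZATION = {"who", "anyone", "thanks", "@", "?"}

def code_post(text: str) -> str:
    t = text.lower()
    c = sum(k in t for k in CODIFICATION)
    p = sum(k in t for k in PERSONALIZATION)
    return "codification" if c >= p else "personalization"

periods = {
    "stage 1": ["Uploaded the onboarding manual as attachment",
                "New wiki link for expense documents"],
    "stage 3": ["@jin who owns the client handover? Thanks!",
                "Anyone free to walk me through the release steps?"],
}

# A rising personalization share across stages would support the hypothesis.
for stage, posts in periods.items():
    codes = [code_post(p) for p in posts]
    share = codes.count("personalization") / len(codes)
    print(f"{stage}: personalization share = {share:.0%}")
```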