• Title/Summary/Keyword: Intelligent information systems

Search results: 4,219

Aspect-Based Sentiment Analysis Using BERT: Developing Aspect Category Sentiment Classification Models (BERT를 활용한 속성기반 감성분석: 속성카테고리 감성분류 모델 개발)

  • Park, Hyun-jung;Shin, Kyung-shik
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.4
    • /
    • pp.1-25
    • /
    • 2020
  • Sentiment Analysis (SA) is a Natural Language Processing (NLP) task that analyzes the sentiments that consumers or the public express toward an arbitrary object in written text. Aspect-Based Sentiment Analysis (ABSA) is a fine-grained analysis of the sentiment toward each aspect of an object. Because it has more practical value for business, ABSA is drawing attention from both academia and industry. Given a review that says "The restaurant is expensive but the food is really fantastic", for example, general SA evaluates the overall sentiment toward the 'restaurant' as 'positive', whereas ABSA identifies the sentiment toward the restaurant's 'price' aspect as 'negative' and toward its 'food' aspect as 'positive'. ABSA therefore enables more specific and effective marketing strategies. To perform ABSA, it is necessary to identify which aspect terms or aspect categories appear in the text and to judge the sentiment toward them. Accordingly, there are four main subtasks in ABSA: aspect term extraction, aspect category detection, Aspect Term Sentiment Classification (ATSC), and Aspect Category Sentiment Classification (ACSC). ABSA is usually conducted either by extracting aspect terms and performing ATSC on them, or by extracting aspect categories and performing ACSC on them. An aspect category is expressed by one or more aspect terms, or is indirectly implied by other words. In the preceding example, 'price' and 'food' are both aspect categories, and the category 'food' is expressed by the aspect term 'food' included in the review. If the review mentioned 'pasta', 'steak', or 'grilled chicken special', these could all be aspect terms for the category 'food'. An aspect category referred to by one or more specific aspect terms is called an explicit aspect. In contrast, a category like 'price', which has no specific aspect term but can be inferred from the opinion word 'expensive', is called an implicit aspect. Up to this point, 'aspect category' has been used to avoid confusion with 'aspect term'; from here on, we treat 'aspect category' and 'aspect' as the same concept and mostly use 'aspect' for convenience. Note that ATSC analyzes sentiment toward given aspect terms and thus deals only with explicit aspects, whereas ACSC covers both explicit and implicit aspects. This study addresses the following issues, which previous work ignored when applying the BERT pre-trained language model to ACSC, and derives superior ACSC models. First, is it more effective to reflect the output vectors of the aspect-category tokens than to use only the final output vector of the [CLS] token as the classification vector? Second, is there a performance difference between QA (Question Answering) and NLI (Natural Language Inference) types in the sentence-pair configuration of the input data? Third, does performance depend on the order of the sentence containing the aspect category in the QA- or NLI-type sentence-pair input? To answer these questions, we implemented 12 ACSC models and conducted experiments on four English benchmark datasets. As a result, we derived ACSC models that outperform existing studies without expanding the training dataset.
In addition, we found that reflecting the output vectors of the aspect-category tokens is more effective than using only the output vector of the [CLS] token as the classification vector. QA-type input generally outperforms NLI-type input, and in the QA type the order of the sentence containing the aspect category does not affect performance. Although results may vary with the characteristics of the dataset, with NLI-type sentence-pair input, placing the sentence containing the aspect category second appears to give better performance. The methodology used here for designing ACSC models could be applied similarly to other tasks such as ATSC.
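To make the two design choices concrete, the following is a minimal sketch, not the authors' implementation, of (a) building a QA- or NLI-style sentence pair for an aspect category and (b) classifying either from the [CLS] output vector or from the pooled output vectors of the aspect-category tokens. The Hugging Face `bert-base-uncased` checkpoint, the auxiliary-sentence wording, and the linear head are illustrative assumptions.

```python
# Sketch of the sentence-pair input styles and classification-vector choices (assumptions noted above).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

def build_pair(review, aspect, style="QA", aspect_first=False):
    # QA style phrases the aspect as a question; NLI style uses a bare pseudo-sentence.
    aux = f"what do you think of the {aspect} ?" if style == "QA" else aspect
    return (aux, review) if aspect_first else (review, aux)

def classification_vector(review, aspect, style="QA", use_aspect_tokens=True):
    first, second = build_pair(review, aspect, style)
    enc = tokenizer(first, second, return_tensors="pt")
    out = bert(**enc).last_hidden_state              # (1, seq_len, 768)
    if not use_aspect_tokens:
        return out[:, 0]                             # [CLS] output vector only
    # One possible pooling variant: average the aspect-category token vectors
    # (wherever those word pieces occur in the pair) together with [CLS].
    aspect_ids = tokenizer(aspect, add_special_tokens=False)["input_ids"]
    mask = torch.isin(enc["input_ids"][0], torch.tensor(aspect_ids))
    mask[0] = True                                   # keep [CLS] as well
    return out[0][mask].mean(dim=0, keepdim=True)

# A linear softmax head over {positive, negative, neutral} would be trained on top.
head = torch.nn.Linear(768, 3)
vec = classification_vector("The restaurant is expensive but the food is really fantastic",
                            "price", style="QA", use_aspect_tokens=True)
logits = head(vec)
```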

Self-optimizing feature selection algorithm for enhancing campaign effectiveness (캠페인 효과 제고를 위한 자기 최적화 변수 선택 알고리즘)

  • Seo, Jeoung-soo;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.4
    • /
    • pp.173-198
    • /
    • 2020
  • For a long time, many academic studies have addressed predicting the success of campaigns targeted at customers, and prediction models applying various techniques are still being studied. Recently, as campaign channels have expanded with the rapid growth of online business, companies carry out campaigns of many types on a scale that cannot be compared to the past. However, customers increasingly perceive campaigns as spam as fatigue from duplicate exposure grows. From a corporate standpoint, campaign effectiveness is also decreasing: investment costs rise while actual campaign success rates stay low. Accordingly, various studies are ongoing to improve campaign effectiveness in practice. The ultimate purpose of a campaign system is to increase the success rate of campaigns by collecting and analyzing customer-related data and using it in campaigns, and recent work has attempted to predict campaign responses with machine learning. Because campaign data have many features, selecting appropriate features is very important. If all input data are used when classifying a large amount of data, learning time grows as the number of classes expands, so a minimal input set must be extracted from the entire data. In addition, when a model is trained with too many features, prediction accuracy may degrade due to overfitting or correlation between features. Therefore, to improve accuracy, a feature selection technique that removes features close to noise should be applied; feature selection is a necessary step for analyzing a high-dimensional dataset. Among greedy algorithms, SFS (Sequential Forward Selection), SBS (Sequential Backward Selection), and SFFS (Sequential Floating Forward Selection) are widely used traditional feature selection techniques, but when there are many features they show poor classification performance and require long learning times. Therefore, in this study we propose an improved feature selection algorithm to enhance the effectiveness of existing campaigns. The purpose of this study is to improve the existing sequential SFFS method in the search for the feature subsets that underpin machine learning model performance, by using statistical characteristics of the data processed in the campaign system. Features that strongly influence performance are derived first, features with a negative effect are removed, and the sequential method is then applied, which increases search efficiency and enables generalized prediction. The proposed model showed better search and prediction performance than the traditional greedy algorithm: compared with the original data set, the greedy algorithm, a genetic algorithm (GA), and recursive feature elimination (RFE), campaign success prediction was higher. In addition, the improved feature selection algorithm helps analyze and interpret prediction results by providing the importance of the derived features.
Features already known statistically to be important, such as age, customer rating, and sales, were confirmed. In addition, features that campaign planners had rarely used to select campaign targets, such as the combined product name, the average data consumption rate over three months, and wireless data usage in the last three months, were unexpectedly selected as important features for campaign response. It was also confirmed that basic attributes can be very important features depending on the type of campaign. This makes it possible to analyze and understand the important characteristics of each campaign type.
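As a rough illustration of the idea, not the authors' exact algorithm, the sketch below first ranks features with a simple statistical score (mutual information is assumed here), drops the weakest ones, and then runs a sequential floating forward search on the reduced candidate set using cross-validated accuracy. The scoring rule, classifier, and thresholds are assumptions.

```python
# Statistical pre-filter followed by a compact SFFS loop (illustrative sketch).
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def prefilter(X, y, keep_ratio=0.5):
    # Keep the top fraction of features by mutual information with the target.
    mi = mutual_info_classif(X, y, random_state=0)
    k = max(1, int(len(mi) * keep_ratio))
    return list(np.argsort(mi)[::-1][:k])

def sffs(X, y, candidates, max_features=10):
    est = LogisticRegression(max_iter=1000)

    def cv_score(idx):
        return cross_val_score(est, X[:, idx], y, cv=5).mean()

    selected, best = [], -np.inf
    while len(selected) < max_features:
        # Forward step: add the candidate that improves the CV score the most.
        gains = [(cv_score(selected + [f]), f) for f in candidates if f not in selected]
        if not gains:
            break
        s, f = max(gains)
        if s <= best:
            break
        selected.append(f)
        best = s
        # Floating step: drop a previously selected feature if removal helps.
        improved = True
        while improved and len(selected) > 2:
            improved = False
            for g in list(selected):
                rest = [x for x in selected if x != g]
                rest_score = cv_score(rest)
                if rest_score > best:
                    selected, best, improved = rest, rest_score, True
    return selected, best

# Usage sketch: cols = prefilter(X, y); feats, score = sffs(X, y, cols)
```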

Target-Aspect-Sentiment Joint Detection with CNN Auxiliary Loss for Aspect-Based Sentiment Analysis (CNN 보조 손실을 이용한 차원 기반 감성 분석)

  • Jeon, Min Jin;Hwang, Ji Won;Kim, Jong Woo
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.4
    • /
    • pp.1-22
    • /
    • 2021
  • Aspect-Based Sentiment Analysis (ABSA), which analyzes sentiment based on the aspects that appear in a text, is drawing attention because it can be used in various industries. ABSA analyzes sentiment for each of the multiple aspects a text contains, and it is studied in various forms depending on the purpose, such as analyzing all targets or only aspects and sentiments. Here, an aspect refers to a property of a target, and a target refers to the expression in the text that triggers the sentiment. For restaurant reviews, for example, the aspects could be food taste, food price, quality of service, and the mood of the restaurant. In a review that says, "The pasta was delicious, but the salad was not," the words "pasta" and "salad," which are directly mentioned in the sentence, are the targets. So far, most ABSA studies have analyzed sentiment based only on aspects or only on targets. However, even with the same aspects or targets, sentiment analysis may be inaccurate, for instance when aspects or sentiments are divided or when sentiment exists without a target. Consider a sentence like "Pizza and the salad were good, but the steak was disappointing": although its aspect is limited to 'food', conflicting sentiments coexist. In a sentence such as "Shrimp was delicious, but the price was extravagant," the target is 'shrimp', yet opposite sentiments coexist depending on the aspect. Finally, in a sentence like "The food arrived too late and is cold now," there is no target (NULL), but it conveys a negative sentiment toward the aspect 'service'. Failing to consider both aspects and targets in such cases, when sentiment or aspect is divided or when sentiment exists without a target, creates a dual dependency problem. To address this problem, this research analyzes sentiment by considering both aspects and targets (Target-Aspect-Sentiment Detection, hereafter TASD). We identified two limitations of existing TASD research: local contexts are not fully captured, and the F1-score drops sharply depending on the number of epochs and the batch size. The current model excels at capturing the overall context and the relations between words, but it struggles with phrases in the local context and is relatively slow to learn. Therefore, this study tries to improve the model's performance. To this end, we added an auxiliary loss for aspect-sentiment classification by constructing CNN (Convolutional Neural Network) layers parallel to the existing model. Whereas existing models analyze aspect-sentiment through BERT encoding, a pooler, and linear layers, this research adds a CNN layer with adaptive average pooling to the existing model, and training proceeds by adding an additional aspect-sentiment loss to the existing loss. In other words, during training, the auxiliary loss computed through the CNN layers allows the local context to be captured more precisely; after training, the model performs aspect-sentiment analysis through the existing method. To evaluate the model, two datasets, SemEval-2015 Task 12 and SemEval-2016 Task 5, were used, and the F1-score increased compared to the existing models. The difference was largest with a batch size of 8 and 5 epochs, where the existing model and the proposed model reached F1-scores of 29 and 45, respectively.
Even when the batch size and number of epochs were adjusted, the F1-scores remained higher than those of the existing models. In other words, even with small batch sizes and few epochs the model can be trained effectively compared to the existing models, so it can be useful where resources are limited. Through this study, aspect-based sentiment can be analyzed more accurately. With various business uses, such as product development or establishing marketing strategies, both consumers and sellers will be able to make more efficient decisions. In addition, because the approach uses a pre-trained model and achieved a relatively high F1-score even with limited resources, the model can be fully trained and utilized by small businesses that do not have much data.
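The sketch below illustrates, under stated assumptions, how such a CNN auxiliary head could sit next to the usual BERT pooler/linear head: during training the cross-entropy of the CNN branch is added to the main loss, and at inference only the main head is used. The layer sizes, kernel width, and base checkpoint are illustrative, not the paper's exact configuration.

```python
# BERT classifier with a parallel CNN auxiliary branch and auxiliary loss (illustrative sketch).
import torch
import torch.nn as nn
from transformers import BertModel

class TASDWithCNNAux(nn.Module):
    def __init__(self, num_labels, model_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        self.main_head = nn.Linear(hidden, num_labels)      # pooler output -> labels
        self.cnn = nn.Sequential(                            # auxiliary local-context branch
            nn.Conv1d(hidden, 256, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.aux_head = nn.Linear(256, num_labels)
        self.ce = nn.CrossEntropyLoss()

    def forward(self, input_ids, attention_mask, labels=None, aux_weight=1.0):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        main_logits = self.main_head(out.pooler_output)
        if labels is None:
            return main_logits                               # inference: main head only
        # Auxiliary branch: Conv1d expects (batch, channels, seq_len).
        conv = self.cnn(out.last_hidden_state.transpose(1, 2)).squeeze(-1)
        aux_logits = self.aux_head(conv)
        loss = self.ce(main_logits, labels) + aux_weight * self.ce(aux_logits, labels)
        return loss, main_logits
```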

Critical Success Factor of Noble Payment System: Multiple Case Studies (새로운 결제서비스의 성공요인: 다중사례연구)

  • Park, Arum;Lee, Kyoung Jun
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.4
    • /
    • pp.59-87
    • /
    • 2014
  • In the MIS field, research on payment services has focused on adoption factors, using behavioral theories such as TRA (Theory of Reasoned Action), TAM (Technology Acceptance Model), and TPB (Theory of Planned Behavior). Previous studies presented various adoption factors depending on the type of payment service, nation, culture, and so on, and even the adoption factors of the same payment service were reported differently by different researchers. The payment service industry has relatively strong path dependency on existing payment methods, so research results on the same payment service differ with a nation's payment culture. This paper aims to suggest, and to support with evidence, a success factor for the adoption of a novel payment service that holds regardless of national culture and payment characteristics. In previous research, the common adoption factors of payment services are convenience, ease of use, security, speed, and the like, but real cases show that these factors are not always critical for successfully penetrating a market. For example, PayByPhone, an NFC-based parking payment service, successfully penetrated the early market and has grown; in contrast, the Google Wallet service failed to be adopted by users despite an NFC-based payment method offering convenience, security, and ease of use. As these cases show, something remains unexplained. The research question therefore emerged: what is the more essential and fundamental factor, taking precedence over convenience, security, and ease of use, for successful market penetration? This paper analyzes four cases based on the following hypothesis and provides support for it: to successfully penetrate a market and grow sustainably, a new payment service should find non-customers of existing payment services and provide a novel payment method that they can actually use. We give plausible explanations for the hypothesis using multiple case studies. Diners Club, Danal, PayPal, and Square were selected as typical and successful cases in each category of payment service. The discussion of the cases is primarily a non-customer analysis of the group a novel payment service targets, in order to find the most crucial factor in the early market; we do not attempt to consider factors for later business growth. We identified the three tiers of non-customers of existing payment methods that each new payment service targeted and elaborated how the new service satisfied them. The credit card targeted the first tier of non-customers, who temporarily cannot pay because they have no cash on hand but have a regular income; the credit card lets them engage in economic activity by delaying the date of payment. The case study of wireless phone payment shows that this service targets the second tier of non-customers, who cannot use online payment because they worry about security or must go through a complex process and learn how to use an online payment method; wireless phone payment therefore provides a very convenient payment method, and in particular it let young people pay small amounts without a credit card. The case study of PayPal, an online payment service, shows that it targets the second tier of non-customers, who refuse to use online payment services because they worry about leaks of sensitive information such as passwords and credit card details.
Accordingly, the PayPal service allows users to pay online without providing sensitive information. Finally, the Square case, a mobile POS-based payment service, shows that it also targets the second tier of non-customers, who cannot complete offline transactions individually when cash runs short; Square provides a dongle that functions as a POS terminal by plugging into the earphone jack. As a result, all four services turned non-customers into customers, penetrated the early market, and extended their market share. Consequently, all cases supported the hypothesis, which is highly probable according to the 'analytic generalization' that case study methodology suggests. For judging the quality of the research design, we consider construct validity, internal validity, external validity, and reliability, which are common to all social science methods and are summarized in numerous textbooks (Yin, 2014); in case study methodology they also serve as a framework for assessing large groups of case studies (Gibbert, Ruigrok & Wicki, 2008). Construct validity means identifying correct operational measures for the concepts being studied; to satisfy it, we use multiple sources of evidence such as academic journals, magazines, and articles. Internal validity means establishing a causal relationship whereby certain conditions are believed to lead to other conditions, as distinguished from spurious relationships; to satisfy it, we build explanations through the analysis of the four cases. External validity defines the domain to which a study's findings can be generalized; to satisfy it, replication logic across multiple case studies is used. Reliability means demonstrating that the operations of a study, such as the data collection procedures, can be repeated with the same results; to satisfy it, we use a case study protocol. In Korea, competition among stakeholders in the mobile payment industry is intensifying. Not only the three main telecom companies but also smartphone makers and service providers such as KakaoTalk have announced that they will enter the mobile payment industry, so the industry is becoming highly competitive; yet it still lacks momentum despite optimistic forecasts of rapid growth. Mobile payment services are categorized by technology, such as IC mobile cards and application payment services based on the cloud, NFC, sound waves, BLE (Bluetooth Low Energy), biometric recognition, and so on. Mobile payment is a discontinuous innovation: users must change their behavior and new infrastructure must be installed, which requires users to learn how to use it and imposes installation costs on shopkeepers; in addition, the payment industry has strong path dependency. Despite these obstacles, mobile payment services, which as discontinuous innovations should provide dramatically improved value as products and services, still focus mainly on convenience and security. We suggest the following for a mobile payment service to succeed. First, the non-customers of existing payment services need to be identified. Second, their needs should be addressed. Then, the novel payment service should give non-customers who cannot pay with existing methods a payment method they can use. In conclusion, mobile payment services can create a new market and will extend the payment market.

Transfer Learning using Multiple ConvNet Layers Activation Features with Principal Component Analysis for Image Classification (전이학습 기반 다중 컨볼류션 신경망 레이어의 활성화 특징과 주성분 분석을 이용한 이미지 분류 방법)

  • Byambajav, Batkhuu;Alikhanov, Jumabek;Fang, Yang;Ko, Seunghyun;Jo, Geun Sik
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.1
    • /
    • pp.205-225
    • /
    • 2018
  • The Convolutional Neural Network (ConvNet) is a class of powerful deep neural networks that can analyze and learn hierarchies of visual features. The first such network (the Neocognitron) was introduced in the 1980s, but at that time neural networks were not widely used in industry or academia because of the shortage of large-scale datasets and low computational power. A few decades later, in 2012, Krizhevsky made a breakthrough in the ILSVRC-12 visual recognition competition using a Convolutional Neural Network, which revived interest in neural networks. The success of Convolutional Neural Networks rests on two main factors: the emergence of advanced hardware (GPUs) for sufficient parallel computation, and the availability of large-scale datasets such as the ImageNet (ILSVRC) dataset for training. Unfortunately, many new domains are bottlenecked by these factors. For most domains, gathering a large-scale dataset to train a ConvNet is difficult and requires a lot of effort; moreover, even with a large-scale dataset, training a ConvNet from scratch requires expensive resources and is time-consuming. These two obstacles can be overcome with transfer learning, a method for transferring knowledge from a source domain to a new domain. There are two major transfer learning settings: using the ConvNet as a fixed feature extractor, and fine-tuning the ConvNet on a new dataset. In the first case, a pre-trained ConvNet (for example, trained on ImageNet) computes feed-forward activations of the image, and activation features are extracted from specific layers. In the second case, the ConvNet classifier is replaced and retrained on the new dataset, and the weights of the pre-trained network are fine-tuned with backpropagation. In this paper, we focus only on using multiple ConvNet layers as a fixed feature extractor. However, using the high-dimensional features extracted directly from multiple ConvNet layers is still challenging. We observe that features extracted from different ConvNet layers capture different characteristics of the image, which means a better representation could be obtained by finding the optimal combination of multiple ConvNet layers. Based on this observation, we propose to employ representations from multiple ConvNet layers for transfer learning instead of a single-layer representation. Our pipeline has three steps. First, images from the target task are fed forward through a pre-trained AlexNet, and the activation features of the three fully connected layers are extracted. Second, the activation features of the three layers are concatenated to obtain a multiple-layer representation that carries more information about the image; when the three fully connected layer features are concatenated, the resulting image representation has 9,192 (4096 + 4096 + 1000) dimensions. However, features extracted from multiple layers of the same ConvNet are redundant and noisy. Thus, in the third step, we use Principal Component Analysis (PCA) to select salient features before the training phase. With salient features, the classifier can classify images more accurately, and the performance of transfer learning can be improved.
To evaluate the proposed method, experiments were conducted on three standard datasets (Caltech-256, VOC07, and SUN397) to compare the multiple-layer representation against single-layer representations, using PCA for feature selection and dimension reduction. The experiments demonstrate the importance of feature selection for the multiple-layer representation. Moreover, the proposed approach achieved 75.6% accuracy compared to 73.9% for the FC7 layer on Caltech-256, 73.1% compared to 69.2% for the FC8 layer on VOC07, and 52.2% compared to 48.7% for the FC7 layer on SUN397. We also showed that the proposed approach achieved superior performance, with accuracy improvements of 2.8%, 2.1%, and 3.1% on Caltech-256, VOC07, and SUN397, respectively, compared to existing work.
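A rough sketch of this pipeline, assuming torchvision's AlexNet layout rather than the authors' exact code, is shown below: the activations of the three fully connected layers (4096 + 4096 + 1000 = 9,192 dimensions) are concatenated, reduced with PCA, and used to train a linear classifier.

```python
# Multi-layer AlexNet features + PCA + linear classifier (illustrative sketch).
import torch
import torchvision.models as models
from sklearn.decomposition import PCA
from sklearn.svm import LinearSVC

alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1).eval()

@torch.no_grad()
def fc_features(images):
    # images: (N, 3, 224, 224) tensor, already normalized for ImageNet.
    x = alexnet.avgpool(alexnet.features(images)).flatten(1)
    fc6 = alexnet.classifier[2](alexnet.classifier[1](x))     # Linear + ReLU -> 4096
    fc7 = alexnet.classifier[5](alexnet.classifier[4](fc6))   # Linear + ReLU -> 4096
    fc8 = alexnet.classifier[6](fc7)                          # final layer -> 1000
    return torch.cat([fc6, fc7, fc8], dim=1).numpy()          # (N, 9192)

def train_classifier(train_images, train_labels, n_components=500):
    feats = fc_features(train_images)
    pca = PCA(n_components=n_components).fit(feats)           # drop redundant/noisy dimensions
    clf = LinearSVC().fit(pca.transform(feats), train_labels)
    return pca, clf
```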

A Proposal of a Keyword Extraction System for Detecting Social Issues (사회문제 해결형 기술수요 발굴을 위한 키워드 추출 시스템 제안)

  • Jeong, Dami;Kim, Jaeseok;Kim, Gi-Nam;Heo, Jong-Uk;On, Byung-Won;Kang, Mijung
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.3
    • /
    • pp.1-23
    • /
    • 2013
  • To discover significant social issues, such as unemployment, economic crisis, and social welfare, that are urgent problems to be solved in modern society, the existing approach has researchers collect opinions from professional experts and scholars through online or offline surveys. However, this method is not always effective. Because of the expense, a large number of survey replies is seldom gathered, and in some cases it is hard to find professionals dealing with a specific social issue, so the sample is often small and may be biased. Furthermore, several experts may reach totally different conclusions about the same social issue because each has a subjective point of view and a different background. In this situation, it is very hard to figure out what the current social issues are and which of them are really important. To overcome these shortcomings, in this paper we develop a prototype system that semi-automatically detects social issue keywords representing social issues and problems from about 1.3 million news articles issued by about 10 major domestic presses in Korea from June 2009 to July 2012. The proposed system consists of (1) collecting and extracting texts from the collected news articles, (2) identifying only the news articles related to social issues, (3) analyzing the lexical items of Korean sentences, (4) finding a set of topics over time regarding social keywords based on probabilistic topic modeling, (5) matching relevant paragraphs to a given topic, and (6) visualizing social keywords for easy understanding. In particular, we propose a novel matching algorithm relying on generative models, whose goal is to best match paragraphs to each topic. Technically, using a topic model such as Latent Dirichlet Allocation (LDA), we obtain a set of topics, each with its relevant terms and their probability values. In our problem, given a set of text documents (e.g., news articles), LDA produces a set of topic clusters, and each topic cluster is labeled by human annotators, where each topic label stands for a social keyword. For example, suppose there is a topic, Topic1 = {(unemployment, 0.4), (layoff, 0.3), (business, 0.3)}, and a human annotator labels Topic1 as "Unemployment Problem". Even then, it is non-trivial to understand what happened to the unemployment problem in our society; looking only at social keywords, we have no idea of the detailed events that occurred. To tackle this, we develop a matching algorithm that computes the probability of a paragraph given a topic, relying on (i) the topic terms and (ii) their probability values. Given a set of text documents, we segment each document into paragraphs, extract a set of topics with LDA, and, based on our matching process, assign each paragraph to the topic it best matches, so that each topic ends up with several best-matched paragraphs. For instance, for the topic "Unemployment Problem" and a best-matched paragraph such as "Up to 300 workers lost their jobs in XXX company at Seoul", we can grasp the detailed information behind the social keyword, such as "300 workers", "unemployment", "XXX company", and "Seoul". In addition, our system visualizes social keywords over time.
Therefore, through our matching process and keyword visualization, researchers will be able to detect social issues easily and quickly. With this prototype system we detected various social issues appearing in our society, and our experimental results show the effectiveness of the proposed methods. A proof-of-concept system is available at http://dslab.snu.ac.kr/demo.html.
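The following is an illustrative sketch of this matching idea, not the authors' exact generative model: LDA is fitted on the articles, each paragraph is scored under every topic by the log-likelihood of its words given that topic's term distribution, and the paragraph is assigned to the best-scoring topic. The vectorizer settings and topic count are assumptions.

```python
# LDA topic extraction and paragraph-to-topic matching (illustrative sketch).
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def fit_topics(documents, n_topics=20):
    vec = CountVectorizer(max_features=20000)
    X = vec.fit_transform(documents)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(X)
    # Row-normalize the topic-term matrix to get p(term | topic).
    term_probs = lda.components_ / lda.components_.sum(axis=1, keepdims=True)
    return vec, term_probs

def match_paragraphs(paragraphs, vec, term_probs):
    counts = vec.transform(paragraphs)            # (n_paragraphs, vocab) word counts
    log_probs = np.log(term_probs + 1e-12)        # (n_topics, vocab)
    scores = counts @ log_probs.T                 # log-likelihood of each paragraph per topic
    return np.asarray(scores).argmax(axis=1)      # best-matching topic index per paragraph
```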

Analysis of media trends related to spent nuclear fuel treatment technology using text mining techniques (텍스트마이닝 기법을 활용한 사용후핵연료 건식처리기술 관련 언론 동향 분석)

  • Jeong, Ji-Song;Kim, Ho-Dong
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.2
    • /
    • pp.33-54
    • /
    • 2021
  • With the fourth industrial revolution and the arrival of the New Normal era brought on by COVID-19, the importance of non-contact technologies such as artificial intelligence and big data research has been increasing. Convergence research is being conducted in earnest to keep up with these trends, but few studies in the nuclear field have used artificial intelligence and big data technologies such as natural language processing and text mining. This study was conducted to confirm the applicability of data science analysis techniques to nuclear research. Furthermore, identifying trends in the public perception of spent nuclear fuel is important because it can inform nuclear industry policy and allow advance responses to policy changes. For these reasons, this study conducted a media trend analysis of pyroprocessing, a spent nuclear fuel treatment technology, and objectively analyzed changes in media perception of spent nuclear fuel dry treatment technology by applying text mining techniques. Text data from Naver web news articles containing the keywords "pyroprocessing" and "sodium-cooled reactor" were collected with Python code to identify changes in perception over time. The analysis period was set from 2007, when the first article was published, to 2020, and a detailed, multi-layered analysis of the text data was carried out through word clouds based on frequency analysis, TF-IDF, and degree centrality calculations. Keyword frequency analysis showed a change in media perception of spent nuclear fuel dry treatment technology in the mid-2010s, influenced by the Gyeongju earthquake in 2016 and the new government's energy conversion policy implemented in 2017. Trend analysis was therefore conducted around these periods, and word frequencies, TF-IDF, degree centrality values, and semantic network graphs were derived. The results show that before the mid-2010s, media perception of spent nuclear fuel dry treatment technology was diplomatic and positive. Over time, however, the frequency of keywords such as "safety", "reexamination", "disposal", and "dismantling" increased, indicating that the sustainability of the technology was being seriously questioned. Social awareness changed as the technology, once perceived as a political and diplomatic asset, became ambiguous due to changes in domestic policy. This means that domestic policy changes, such as nuclear power policy, have a greater impact on media perception than issues of spent nuclear fuel processing technology itself, presumably because nuclear policy is a more widely discussed and publicly familiar topic than spent nuclear fuel. Therefore, to improve social awareness of spent nuclear fuel processing technology, it would be necessary to provide sufficient information about it, and linking it to nuclear policy issues would also help. The study also highlights the importance of social science research on nuclear power: applying the social sciences broadly to nuclear engineering, and taking national policy changes into account, helps keep the nuclear industry sustainable.
However, this study has the limitation that big data analysis methods were applied only to a narrow research area, namely pyroprocessing, a spent nuclear fuel dry processing technology. Furthermore, there was no clear basis for the cause of the change in social perception, and only news articles were analyzed to determine it. If future work also considers reader comments and extends the media trend analysis to nuclear power in general, more reliable results can be produced and used efficiently in nuclear policy research. The academic significance of this study is that it confirmed the applicability of data science analysis techniques in the field of nuclear research. Furthermore, given current government energy policies such as the reduction of nuclear power plants, research on spent fuel treatment technology is being re-evaluated, and keyword analysis in this field can help set the direction of future research. It is important to consider outside views, not just the safety and engineering integrity of nuclear power, and to reconsider whether it is appropriate to discuss nuclear engineering technology only internally. If multidisciplinary research on nuclear power is carried out, reasonable alternatives can be prepared to sustain the nuclear industry.
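As a small illustration of the kind of analysis described above, the sketch below computes aggregate TF-IDF scores and degree centrality on a simple word co-occurrence network. Whitespace tokenization and the parameter choices are assumptions; the study itself works on Korean news text, which would require morphological analysis.

```python
# Per-corpus TF-IDF ranking and degree centrality on a word co-occurrence network (illustrative sketch).
from itertools import combinations
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer

def top_tfidf_terms(articles, top_n=20):
    vec = TfidfVectorizer(max_features=5000)
    X = vec.fit_transform(articles)
    scores = X.sum(axis=0).A1                      # aggregate TF-IDF weight per term
    terms = vec.get_feature_names_out()
    return sorted(zip(terms, scores), key=lambda t: -t[1])[:top_n]

def degree_centrality_ranking(articles, top_n=20):
    g = nx.Graph()
    for text in articles:
        words = set(text.lower().split())          # placeholder for Korean morpheme analysis
        g.add_edges_from(combinations(sorted(words), 2))
    dc = nx.degree_centrality(g)
    return sorted(dc.items(), key=lambda t: -t[1])[:top_n]
```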

Sensory Information Processing

  • Yoshimoto, Chiyoshi
    • Journal of Biomedical Engineering Research
    • /
    • v.6 no.2
    • /
    • pp.1-8
    • /
    • 1985

High Resolution HC$_3$N Observations toward the Central Region of Sagittarius B2

  • Chung, H.S.;Ohishi, M.;Morimoto, M.
    • Bulletin of the Korean Space Science Society
    • /
    • 1993.10a
    • /
    • pp.17-17
    • /
    • 1993
  • The wall shear stress in the vicinity of end-to-end anastomoses under steady flow conditions was measured using a flush-mounted hot-film anemometer (FMHFA) probe. The experimental measurements were in good agreement with numerical results except in flows with low Reynolds numbers. The wall shear stress increased proximal to the anastomosis in flow from the Penrose tubing (simulating an artery) to the PTFE graft. In flow from the PTFE graft to the Penrose tubing, low wall shear stress was observed distal to the anastomosis. Abnormal distributions of wall shear stress in the vicinity of the anastomosis, resulting from the compliance mismatch between the graft and the host artery, might be an important factor in ANFH formation and graft failure. The present study suggests a correlation between regions of low wall shear stress and the development of anastomotic neointimal fibrous hyperplasia (ANFH) in end-to-end anastomoses.

30523 T00401030523 Air pressure decay (APD) rate and ultrafiltration rate (UFR) tests were performed on new and saline-rinsed dialyzers as well as those reused in patients several times. C-DAK 4000 (Cordis Dow) and CF 15-11 (Baxter Travenol) reused dialyzers obtained from the dialysis clinic were used in the present study. The new dialyzers exhibited a relatively flat APD, whereas saline-rinsed and reused dialyzers showed a considerable amount of decay. C-DAK dialyzers had a larger APD (11.70±1.32 mmHg/min) compared to CF dialyzers (4.32±0.55 mmHg/min) (p<0.05). However, there was no observable difference in the UFR between the two dialyzers. Neither APD nor UFR showed any significant increase with an increasing number of reuses, for up to more than 20 reuses. A substantial number of the failures observed in APD (larger than 20 mmHg/min) on the reused dialyzers (2 out of 40 CF and 5 out of 26 C-DAK) were attributed to possible damage to the fibers. The CF 15-11 HFDs which failed the APD test did not show changes in UFR compared to normal dialyzers, indicating that APD is a more sensitive test than the UFR test for evaluating the integrity of the fibers.

30527 T00401030527 For quantitative measurement of the light reflected from a clinical diagnostic strip, a prototype reflectance photometer was designed. A strip loader and cassette were made to obtain more accurate reflectance parameters. The strip was illuminated at 45° through an optical fiber, and the intensity of the reflected light was measured at a right angle using a photodiode. The Kubelka-Munk coefficient and reflection optical density were determined at four different wavelengths (500, 550, 570 and 610 nm) for a blood glucose strip. For glucose concentrations higher than 300 mg/dl, saturation of the absorbance was observed at 500, 550 and 570 nm. The correlation between glucose concentration and the measured parameters was best at 610 nm.

30535 T00401030535 Radiation-induced fibrosarcoma tumors were grown on the flanks of C3H mice. The mice were divided into two groups: one group was injected intravenously with Photofrin II (2.5 mg/kg body weight), and the other group received no Photofrin II. Mice from both groups were irradiated for approximately 15 minutes at 100, 300, or 500 mW/cm2 with argon (488 nm/514.5 nm), dye (628 nm) and gold vapor (pulsed 628 nm) laser light. The photosensitizer behaved as an added absorber. Under our experimental conditions, the presence of Photofrin II increased the surface temperature by at least 40%, and the temperature rise due to 300 mW/cm2 irradiation exceeded values for hyperthermia.
Light and temperature distributions with depth were estimated by a computer model. The model demonstrated the influence of wavelength on the thermal process and proved to be a valuable tool for investigating the internal temperature rise.

30536 T00401030536 We investigated the structural geometry of thirty-eight Korean femurs. The purpose of this study is to identify major geometrical differences between Korean femurs and others that we believe belong to Caucasians, so as to gain insight into a femoral component design that fits Asians, including Koreans. We utilized computed tomography (CT) images of femurs extracted from cadavers. The CT images were transformed into bitmap data using a film scanner and then analyzed with a commercially available software package called Image v.1.0 on a Macintosh IIci computer. The resulting data were compared with already published data. The major results show that the geometry of the Korean femurs is significantly different from that of Caucasians: (1) the anteversion angle and the canal flare index are greater by approximately 8° and 0.5, respectively, (2) the shape of the isthmus cross section is more round, and (3) the distance between the lesser trochanter and the proximal border of the isthmus is shorter by about 15 mm. The results suggest that the femoral component suitable for Asians should be different from the currently used components designed and manufactured mostly by European or American companies.

30537 T00401030537 It is well known that the nonlinear propagation characteristics of waves in tissue may give very useful information for medical diagnosis. In this paper, a new method is proposed to detect the nonlinear propagation characteristics of internal vibration in tissue under low-frequency mechanical vibration by using bispectral analysis. In the method, a low-frequency vibration of f0 (=100 Hz) is applied to the surface of the object, and the waveform of the internal vibration x(t) is measured from the Doppler frequency modulation of simultaneously transmitted probing ultrasonic waves. Then the bispectra of the signal x(t) at the frequencies (f0, f0) and (f0, 2f0) are calculated, and the nonlinear propagation characteristics are estimated as their magnitude ratio; since the bispectrum is free from Gaussian additive noise, the value can be obtained with a high S/N. A basic experimental system was constructed using 3.0 MHz probing ultrasonic waves, and several experiments were carried out on phantoms. The results show the superiority of the proposed method over the conventional method using the power spectrum, and also its usefulness for tissue characterization.

30541 T00401030541 This paper describes the implementation of computerized radial pulse diagnosis with the aid of a clinical expert. On this basis, we composed a radial pulse diagnosis system for Korean traditional medicine. The system consists of a radial pulse wave detection system and a radial pulse diagnosis system. With the detection system, we detected and processed the Inyoung and Cheongu radial pulse waves, obtained the characteristic parameters of the radial pulse wave, and quantified them according to the method of Inyoung-Cheongu comparison radial pulse diagnosis. We defined the judgement standard of the radial pulse diagnosis system and confirmed the feasibility of automatic radial pulse diagnosis in Korean traditional medicine.
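The bispectral-analysis abstract above characterises nonlinearity by the ratio of bispectrum magnitudes at (f0, f0) and (f0, 2f0). As a hedged illustration of how a single bispectrum value can be estimated by segment averaging, and not as the authors' implementation, one might write the following; the sampling rate, segment length, and toy signal are assumptions.

```python
import numpy as np

def bispectrum_value(x, fs, f1, f2, seg_len=1024):
    """Estimate B(f1, f2) = E[X(f1) X(f2) conj(X(f1+f2))] by averaging
    windowed FFTs over non-overlapping segments of the signal x.

    fs is the sampling rate; frequencies are mapped to the nearest FFT bin.
    """
    n_seg = len(x) // seg_len
    freqs = np.fft.rfftfreq(seg_len, d=1.0 / fs)
    k1 = np.argmin(np.abs(freqs - f1))
    k2 = np.argmin(np.abs(freqs - f2))
    k3 = np.argmin(np.abs(freqs - (f1 + f2)))
    acc = 0.0 + 0.0j
    for i in range(n_seg):
        seg = x[i * seg_len:(i + 1) * seg_len]
        X = np.fft.rfft(seg * np.hanning(seg_len))
        acc += X[k1] * X[k2] * np.conj(X[k3])
    return acc / n_seg

# Toy vibration: a 100 Hz component with a phase-coupled second harmonic.
fs, f0 = 8000.0, 100.0
t = np.arange(0, 4.0, 1.0 / fs)
x = np.cos(2 * np.pi * f0 * t) + 0.3 * np.cos(2 * np.pi * 2 * f0 * t)
ratio = (np.abs(bispectrum_value(x, fs, f0, f0))
         / np.abs(bispectrum_value(x, fs, f0, 2 * f0)))
```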
30545 T00401030545 Microspheres are expected to be applied to biomedical areas such as solid-phase immunoassays, drug delivery systems, and immunomagnetic cell separation. To synthesize microspheres for biomedical application, a "two stage shot growth method" was developed. The uniformity ratio of the synthesized microspheres was always smaller than 1.05, and the surface charge density (or the number of ionizable functional groups) of microspheres synthesized by the "two stage shot growth method" was 6~13 times higher than that of microspheres synthesized by conventional seeded batch copolymerization. As a preliminary step toward biomedical application, adsorption experiments of bovine albumin on the microspheres were carried out under various conditions. The maximum adsorbed amount was obtained in the neighborhood of pH 4.5; since the isoelectric point of bovine albumin is pH 5.0, the experimental result shows a shift toward the acidic region. The adsorption isotherm was obtained, and the plateau region was always reached at 2.0 g/L (bulk concentration of bovine albumin). The effect of the kind and amount of surface functional groups was also examined.

30575 T00401030575 A medical image workstation was developed using multimedia techniques. The system, based on a PC-486DX, was designed to acquire medical images produced by medical imaging instruments together with related audio information, that is, doctors' reported results. The input information was processed and analyzed, and the results were presented in the form of graphs and animation. All information in the system was hierarchically related, with the image at the apex. Processing and analysis algorithms were implemented so that diagnostic accuracy could be improved. The diagnostic information can be transferred for patient diagnosis through a LAN (local area network).

30592 T00401030592 In a conventional infrared imaging system, complex infrared lens systems are usually used for directing collimated narrow infrared beams into a high-speed two-dimensional optic scanner. In this paper, a simple reflective infrared optic system with a two-dimensional optic scanner is proposed for the realization of a medical infrared thermography system. It has been experimentally proven that the infrared thermography system composed of the proposed optic system has a temperature resolution of 0.1°C with a spatial resolution of 1 mrad, an image matrix size of 256 x 240, and an imaging time of 4 seconds.

30593 T00401030593 In this paper, an MIIS (Medical Image Information System) has been designed and implemented using the INGRES RDBMS, based on a client/server architecture. The implemented system allows users to register and retrieve patient information, medical images and diagnostic reports. It also provides the function to display this information simultaneously on workstation windows using the designed menu-driven graphical user interface. Medical image compression/decompression techniques are implemented and integrated into the medical image database system for efficient data storage and fast access through the network.

30594 T00401030594 In this paper, a computerized BEAM was implemented for the spatial domain analysis of EEG. The transformation from temporal summation to two-dimensional mappings is performed by a 4-nearest-point interpolation method (an illustrative interpolation sketch is given after the abstracts below). Two methods of representing the BEAM are used.
One is a dot density method, which classifies the brain electrical potential into 9 levels by the dot density of gray levels, and the other is a colour method, which classifies it into 12 levels by red-green colours. With this BEAM, the instantaneous change and the average energy distribution over any arbitrary time interval of brain electrical activity could be observed and analyzed easily. In the frequency domain, the distribution of the energy spectrum of a specific band can easily distinguish normality from abnormality.

30608 T00401030608 A laboratory information system (LIS) is a key tool for managing laboratory data in clinical pathology. Our department has developed an information system for routine hematology using a down-sized computer system. We used an IBM 486-compatible PC with 16 MB main memory, a 210 MB hard disk drive, 9 RS-232C ports and a 24-pin dot printer. The operating system and database management system were SCO UNIX and SCO FoxBASE, respectively. For program development we used the Xbase language provided by SCO FoxBASE, and the C language was used for interfacing. To make the system user friendly, pull-down menus were used. The system is connected to our hospital information system via an application program interface (API), so the information related to patients and request details is automatically transmitted to our computer. Our system is interfaced with two complete blood count analyzers (Sysmex NE-8000 and Coulter STKS) for unidirectional data transmission from analyzer to computer. The authors suggest that this system based on a down-sized computer could provide a progressive approach to a total LIS based on a local area network, and that the implemented system could serve as a model for other hospitals' LIS for routine hematology.

30609 T00401030609 To develop an artificial bone substitute that is gradually degraded and replaced by regenerated natural bone, the authors designed a composite consisting of calcium phosphate and collagen. For use as the structural matrix of the composite, collagen was purified from human umbilical cord. The obtained collagen was treated with pepsin to remove telopeptides, and finally an immune-free atelocollagen was produced. The cross-linked atelocollagen was highly resistant to collagenase-induced collagenolysis, and the cross-linked collagen demonstrated an improved tensile strength.

30618 T00401030618 This paper is a study on the design of an adaptive filter for QRS complex detection. We propose a simple adaptive algorithm to increase the noise cancellation capability in QRS complex detection with a two-stage adaptive filter (an illustrative prediction-error filter sketch also follows the abstracts below). At the first stage background noise is removed, and at the next stage only the spectrum of the QRS complex components is passed. The two adaptive filters can keep track of the changes in both the noise and the QRS complex. Each adaptive filter consists of a prediction error filter and an FIR filter; the impulse response of the FIR filter uses the coefficients of the prediction error filter. The detection rates for records 105 and 108 of the MIT/BIH database were 99.3% and 97.4%, respectively.

30619 T00401030619 To develop an artificial bone substitute that is gradually degraded and replaced by regenerated natural bone, the authors designed and produced a composite consisting of calcium phosphate and collagen. Pepsin-treated type I atelocollagen of human umbilical cord origin was used as the structural matrix, in which sintered or non-sintered carbonate apatite was encapsulated to form an inorganic-organic composite.
By cross-linking the atelocollagen with UV irradiation, both the compressive and tensile strength were increased, and collagen degradation by collagenase-induced collagenolysis was also decreased.

30620 T00401030620 We have developed a monoleaflet polymer valve as an inexpensive and viable alternative, especially for short-term use in a ventricular assist device or total artificial heart. The frame and leaflet of the polymer valve were made from polyurethane. To evaluate the hemodynamic performance of the polymer valve, a comparative in vitro study of the flow dynamics past the polymer valve and a St. Jude Medical prosthetic valve was made under physiological pulsatile flow conditions. Comparisons between the valves were made on the transvalvular pressure drop, regurgitation volume and maximum valve opening area. The polymer valve showed a smaller regurgitation volume and transvalvular pressure drop than the mechanical valve at higher heart rates. The results showed that the functional characteristics of the polymer valve compared favorably with those of the mechanical valve at higher heart rates.

30621 T00401030621 The explosive evaporative removal of biological tissue by absorption of a CW laser has been simulated using gelatin and a multimode Nd:YAG laser. Because the point of maximum temperature of laser-irradiated gelatin lies below the surface due to surface cooling, evaporation at the boiling temperature occurs explosively from below the surface. The important parameters of this process are the ratio of conduction loss to absorbed laser power (defined as the conduction-to-laser power parameter, Nk), the ratio of convection heat transfer at the surface to conduction loss (defined as Bi), the dimensionless extinction coefficient (defined as Br), and the dimensionless irradiation time (defined as Fo). The dependence of Fo on Nk and Bi has been observed experimentally, and the results have been compared with numerical results obtained by solving a two-dimensional conduction equation. Fo and the explosion depth (from the surface to the point of maximum temperature) increase when Nk and Bi are increased. To find the minimum laser power for the explosive evaporative removal process, a steady-state analysis has also been made, and the limit of Nk that can induce evaporative removal, which is proportional to the inverse of the laser power, has been obtained.

30622 T00401030622 N1 and N2 gross neural action potentials were measured from the round window of the guinea pig cochlea at the onset of acoustic stimuli. N1-N2 audiograms were made by regulating the stimulus intensities so as to produce constant N1-N2 potentials as criteria for different input tone pip frequencies. The lowest threshold was measured with an input tone pip of 15 dB SPL in intensity and 12 kHz in frequency when the animal was in normal physiological condition. The procedure of the experimental measurements is explained in detail. This experimental approach is very useful for the investigation of cochlear function: both the nonlinear and the active functions of the cochlea can be monitored by N1-N2 audiograms.

30623 T00401030623 In electrical impedance tomography (EIT), we use boundary current and voltage measurements to provide information about the cross-sectional distribution of electrical impedance or resistivity. One of the major problems in EIT has been the inaccessibility of internal voltage or current data in finding the internal impedance values.
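The BEAM abstract earlier in this listing maps EEG-derived values onto a two-dimensional image with a 4-nearest-point interpolation, but gives no implementation details. The sketch below uses inverse-distance weighting over the four nearest electrodes as one plausible reading of that method; the electrode layout, values, and grid size are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def beam_map(electrode_xy, potentials, grid=64):
    """Interpolate electrode values onto a grid: each pixel is the
    inverse-distance-weighted average of its four nearest electrodes.

    electrode_xy : (n, 2) electrode coordinates scaled into [0, 1] x [0, 1]
    potentials   : (n,) value per electrode (e.g. band energy or potential)
    """
    xs = np.linspace(0.0, 1.0, grid)
    image = np.zeros((grid, grid))
    for i, y in enumerate(xs):
        for j, x in enumerate(xs):
            d = np.hypot(electrode_xy[:, 0] - x, electrode_xy[:, 1] - y)
            nearest = np.argsort(d)[:4]            # four nearest electrodes
            w = 1.0 / (d[nearest] + 1e-9)          # inverse-distance weights
            image[i, j] = np.sum(w * potentials[nearest]) / np.sum(w)
    return image

# Illustrative 8-electrode layout and values.
xy = np.array([[0.2, 0.2], [0.5, 0.1], [0.8, 0.2], [0.1, 0.5],
               [0.9, 0.5], [0.2, 0.8], [0.5, 0.9], [0.8, 0.8]])
vals = np.array([1.0, 0.8, 0.6, 0.9, 0.5, 0.7, 0.4, 0.3])
image = beam_map(xy, vals)
```

The interpolated image could then be quantised into the 9 gray levels or 12 colour levels that the abstract describes.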
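The adaptive QRS-detection abstract above builds each stage around a prediction error filter. As a minimal, illustrative sketch of that building block only, and not of the authors' full two-stage design, an LMS prediction-error filter can be written as follows; the filter order, step size, and toy signal are assumptions.

```python
import numpy as np

def prediction_error_filter(x, order=8, mu=0.01):
    """Adaptive linear prediction-error filter trained with LMS.

    At each step the filter predicts x[n] from the previous `order` samples
    and outputs the prediction error; slowly varying background is tracked
    and suppressed, while abrupt QRS-like transients appear as large errors.
    Order and step size are illustrative.
    """
    w = np.zeros(order)
    e = np.zeros(len(x))
    for n in range(order, len(x)):
        past = x[n - order:n][::-1]      # most recent sample first
        e[n] = x[n] - w @ past           # prediction error output
        w += mu * e[n] * past            # LMS coefficient update
    return e, w

# Toy ECG-like record: slow baseline drift plus narrow spikes (illustrative).
fs = 250
t = np.arange(0, 4, 1 / fs)
ecg = 0.3 * np.sin(2 * np.pi * 0.4 * t)
ecg[np.arange(50, len(t), 200)] += 1.0    # crude stand-ins for QRS complexes
error, coeffs = prediction_error_filter(ecg)
```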

  • PDF