
EFFECT OF INSTRUMENT COMPLIANCE ON THE POLYMERIZATION SHRINKAGE STRESS MEASUREMENTS OF DENTAL RESIN COMPOSITES (측정장치의 compliance 유무가 복합레진의 중합수축응력의 측정에 미치는 영향)

  • Seo, Deog-Gyu;Min, Sun-Hong;Lee, In-Bog
    • Restorative Dentistry and Endodontics, v.34 no.2, pp.145-153, 2009
  • The purpose of this study was to evaluate the effect of instrument compliance on the polymerization shrinkage stress measurements of dental composites. The contraction strain and stress of composites during light curing were measured by a custom-made stress-strain analyzer, which consisted of a displacement sensor, a cantilever load cell and a negative feedback mechanism. The instrument can measure the polymerization stress in two modes: a with-compliance mode, in which the instrument compliance is allowed, and a without-compliance mode, in which it is not. A flowable (Filtek Flow: FF) and two universal hybrid (Z100: Z1 and Z250: Z2) composites were studied. A silane-treated metal rod with a diameter of 3.0 mm was fixed at the free end of the load cell, and another metal rod was fixed on the base plate. Composite of 1.0 mm thickness was placed between the two rods and light cured. The axial shrinkage strain and stress of the composite were recorded for 10 minutes during polymerization, and the tensile modulus of the materials was also determined with the instrument. The statistical analysis was conducted by ANOVA, paired t-test and Tukey's test (α < 0.05). There were significant differences between the two measurement modes and among the materials. In the with-compliance mode, the contraction stress of FF was the highest: 3.11 (0.13) MPa, followed by Z1: 2.91 (0.10) and Z2: 1.94 (0.09) MPa. When the instrument compliance was not allowed, the contraction stress of Z1 was the highest: 17.08 (0.89) MPa, followed by FF: 10.11 (0.29) and Z2: 9.46 (1.63) MPa. The tensile modulus for Z1, Z2 and FF was 2.31 (0.18), 2.05 (0.20) and 1.41 (0.11) GPa, respectively. In the with-compliance mode, the measured stress correlated with the axial shrinkage strain of the composite, while without compliance the elastic modulus of the materials played a significant role in the stress measurement.
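As a first-order illustration of why the modulus dominates in the rigid mode, the reported no-compliance stresses and tensile moduli imply axial strains via ε = σ/E. This is only a sketch restating the abstract's own mean values; it ignores viscoelastic flow during cure:

```python
# Implied axial strain in the rigid (no-compliance) mode: eps = sigma / E.
# Values are the means reported in the abstract; GPa is converted to MPa.
data = {  # material: (no-compliance stress in MPa, tensile modulus in GPa)
    "Z1": (17.08, 2.31),
    "FF": (10.11, 1.41),
    "Z2": (9.46, 2.05),
}
for name, (sigma_mpa, e_gpa) in data.items():
    strain_pct = sigma_mpa / (e_gpa * 1000) * 100  # percent strain
    print(f"{name}: implied axial strain ~ {strain_pct:.2f}%")
```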

Korean Word Sense Disambiguation using Dictionary and Corpus (사전과 말뭉치를 이용한 한국어 단어 중의성 해소)

  • Jeong, Hanjo;Park, Byeonghwa
    • Journal of Intelligence and Information Systems, v.21 no.1, pp.1-13, 2015
  • As opinion mining in big data applications has been highlighted, a lot of research on unstructured data has been conducted. Social media on the Internet generate unstructured or semi-structured data every second, often written in the natural languages we use in daily life. Many words in human languages have multiple meanings or senses, which makes it very difficult for computers to extract useful information from these datasets. Traditional web search engines are usually based on keyword search, resulting in incorrect search results that are far from users' intentions. Even though much progress has been made over recent years in enhancing the performance of search engines in order to provide users with appropriate results, there is still much room for improvement. Word sense disambiguation can play a very important role in natural language processing and is considered one of the most difficult problems in this area. Major approaches to word sense disambiguation can be classified as knowledge-based, supervised corpus-based, and unsupervised corpus-based approaches. This paper presents a method which automatically generates a corpus for word sense disambiguation by taking advantage of examples in existing dictionaries, avoiding expensive sense-tagging processes. It tests the effectiveness of the method with a Naïve Bayes model, one of the supervised learning algorithms, using the Korean standard unabridged dictionary and the Sejong Corpus. The Korean standard unabridged dictionary has approximately 57,000 sentences; the Sejong Corpus has about 790,000 sentences tagged with both part-of-speech and senses. For the experiment, the dictionary and the corpus were evaluated both combined and separately, using cross validation. Only nouns, the target subjects in word sense disambiguation, were selected. 93,522 word senses among 265,655 nouns and 56,914 sentences from related proverbs and examples were additionally combined into the corpus. The Sejong Corpus was easily merged with the Korean standard unabridged dictionary because it was tagged based on the sense indices defined by that dictionary. Sense vectors were formed after the merged corpus was created. Terms used in creating the sense vectors were added to the named-entity dictionary of a Korean morphological analyzer. Using the extended named-entity dictionary, term vectors were extracted from the input sentences and term vectors for the sentences were created. Given the extracted term vector and the sense-vector model made during the pre-processing stage, the sense-tagged terms were determined by vector-space-model-based word sense disambiguation. In addition, this study shows the effectiveness of the corpus merged from the examples in the Korean standard unabridged dictionary and the Sejong Corpus: the experiment shows that better precision and recall are obtained with the merged corpus. This study suggests the method can practically enhance the performance of Internet search engines and help us understand the meaning of a sentence more accurately in natural language processing tasks pertinent to search engines, opinion mining, and text mining. The Naïve Bayes classifier used in this study is a supervised learning algorithm that applies Bayes' theorem under the assumption that all senses are independent. Even though this assumption is not realistic and ignores correlations between attributes, the Naïve Bayes classifier is widely used because of its simplicity, and in practice it is known to be very effective in many applications such as text classification and medical diagnosis. However, further research needs to be carried out to consider all possible combinations and/or partial combinations of all senses in a sentence. Also, the effectiveness of word sense disambiguation may be improved if rhetorical structures or morphological dependencies between words are analyzed through syntactic analysis.
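The sense-vector matching described above can be sketched roughly as follows. The mini-dictionary, words, and sense labels are invented English stand-ins for the Korean resources, and simple cosine matching over bag-of-words vectors stands in for the full pipeline:

```python
from collections import Counter
import math

def vector(tokens):
    # Bag-of-words vector as a term-frequency Counter.
    return Counter(tokens)

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def build_sense_vectors(examples_by_sense):
    # Merge all dictionary example sentences for a sense into one vector.
    return {sense: vector([t for ex in examples for t in ex])
            for sense, examples in examples_by_sense.items()}

def disambiguate(context_tokens, sense_vectors):
    # Pick the sense whose vector is most similar to the input context.
    ctx = vector(context_tokens)
    return max(sense_vectors, key=lambda s: cosine(ctx, sense_vectors[s]))

# Hypothetical mini-dictionary for the ambiguous word "bank".
examples = {
    "bank/finance": [["deposit", "money", "account"], ["loan", "interest", "money"]],
    "bank/river":   [["river", "water", "shore"], ["fishing", "river", "mud"]],
}
senses = build_sense_vectors(examples)
print(disambiguate(["borrow", "money", "loan"], senses))  # bank/finance
```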

Attention to the Internet: The Impact of Active Information Search on Investment Decisions (인터넷 주의효과: 능동적 정보 검색이 투자 결정에 미치는 영향에 관한 연구)

  • Chang, Young Bong;Kwon, YoungOk;Cho, Wooje
    • Journal of Intelligence and Information Systems, v.21 no.3, pp.117-129, 2015
  • As the Internet becomes ubiquitous, a large volume of information is posted on it, growing exponentially every day. Accordingly, it is not unusual for investors in stock markets to gather and compile firm-specific or market-wide information through online searches. Importantly, powerful search tools on the Internet make it easier for investors to acquire value-relevant information for their investment decisions. Our study examines whether the Internet helps investors assess a firm's value better, using firm-level data over a long period spanning January 2004 to December 2013. To this end, we construct weekly search volumes for information technology (IT) services firms on the Internet. We limit our focus to IT firms since they often hold intangible assets and are relatively less recognized by the public, which makes them hard to measure. To obtain information on those firms, investors are more likely to consult the Internet and use the information to value the firms more accurately and eventually improve their investment decisions. Prior studies have shown that changes in search volumes can reflect various aspects of complex human behavior and forecast near-term values of economic indicators, including automobile sales and unemployment claims. Moreover, the search volume of firm names or stock ticker symbols has been used as a direct proxy of individual investors' attention in financial markets since, unlike indirect measures such as turnover and extreme returns, it can reveal and quantify the interest of investors in an objective way. Following this line of research, this study aims to gauge whether the information retrieved from the Internet is value-relevant in assessing a firm. We also use search volume for analysis but, distinguished from prior studies, explore its impact on return comovements with market returns. Given that a firm's returns tend to comove excessively with market returns when investors are less informed about the firm, we empirically test the value of information by examining the association between Internet searches and the extent to which a firm's returns comove. Our results show that Internet searches are negatively associated with return comovements, as expected. When the sample is split by firm size, the impact of Internet searches on return comovements is shown to be greater for large firms than for small ones. Interestingly, we find a greater impact of Internet searches on return comovements for the years 2009 to 2013 than for earlier years, possibly due to more aggressive and informed use of Internet searches in obtaining financial information. We also complement our analyses by examining the association between return volatility and Internet search volumes. If Internet searches capture investors' attention associated with a change in firm-specific fundamentals such as new product releases or stock splits, a firm's return volatility is likely to increase while search results provide value-relevant information to investors. Our results suggest that, in general, an increase in the volume of Internet searches is not positively associated with return volatility. However, we find a positive association between Internet searches and return volatility when the sample is limited to larger firms. The stronger result for larger firms implies that investors still pay less attention to the information obtained from Internet searches for small firms, even though the information is value-relevant in assessing stock values. However, we do not find any systematic differences in the magnitude of the impact of Internet searches on return volatility across time periods. Taken together, our results shed new light on the value of information searched from the Internet in assessing stock values. Given the informational role of the Internet in stock markets, we believe the results will guide investors to exploit Internet search tools to be better informed, thereby improving their investment decisions.
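Return comovement of this kind is commonly measured by the R² from regressing a firm's returns on market returns; a minimal sketch with made-up weekly returns (the variable names and numbers are hypothetical, not the study's data):

```python
def r_squared(firm, market):
    # R^2 of a simple OLS regression of firm returns on market returns:
    # cov(f, m)^2 / (var(f) * var(m)); higher R^2 = stronger comovement.
    n = len(firm)
    mf, mm = sum(firm) / n, sum(market) / n
    cov = sum((f - mf) * (m - mm) for f, m in zip(firm, market))
    var_f = sum((f - mf) ** 2 for f in firm)
    var_m = sum((m - mm) ** 2 for m in market)
    return cov * cov / (var_f * var_m)

market = [0.010, -0.020, 0.015, 0.005, -0.010, 0.020]
tracker = [2 * m for m in market]                     # pure market exposure
noisy = [0.010, 0.010, -0.020, 0.020, -0.010, 0.000]  # firm-specific news mixed in
print(r_squared(tracker, market), r_squared(noisy, market))
```

A firm whose returns carry more firm-specific information shows a lower R², which is the sense in which less comovement indicates better-informed pricing.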

The Intelligent Determination Model of Audience Emotion for Implementing Personalized Exhibition (개인화 전시 서비스 구현을 위한 지능형 관객 감정 판단 모형)

  • Jung, Min-Kyu;Kim, Jae-Kyeong
    • Journal of Intelligence and Information Systems, v.18 no.1, pp.39-57, 2012
  • Recently, due to the introduction of high-tech equipment into interactive exhibits, much attention has been focused on exhibits that can double the exhibition effect through interaction with the audience. Such interactive exhibitions also make it possible to measure a variety of audience reactions. Among the various audience reactions, this research uses changes in facial features that can be collected in an interactive exhibition space. This research develops an artificial-neural-network-based prediction model to predict the response of the audience by measuring the change in facial features when the audience is given a stimulus from a non-excited state. To represent the emotional state of the audience, this research uses a valence-arousal model. The research suggests an overall framework composed of the following six steps. The first step collects data for modeling; the data was collected from people who participated in the 2012 Seoul DMC Culture Open and was used for the experiments. The second step extracts 64 facial features from the collected data and compensates the facial-feature values. The third step generates the independent and dependent variables of an artificial neural network model. The fourth step selects, using statistical techniques, the independent variables that affect the dependent variable. The fifth step builds an artificial neural network model and performs a learning process using a training set and a test set. The sixth and final step validates the prediction performance of the artificial neural network model using a validation data set. The proposed model was compared with a statistical predictive model to see whether it performed better. As a result, although the data set in this experiment had much noise, the proposed model showed better results than a multiple regression analysis model. If this prediction model of audience reaction were used in a real exhibition, it would be able to provide countermeasures and services appropriate to the audience's reaction while viewing the exhibits. Specifically, if the audience's arousal about an exhibit is low, action to increase arousal could be taken; for instance, recommending other preferred contents to the audience, or using light or sound to focus attention on the exhibit. In other words, when planning future exhibitions, it would be possible to plan the exhibition to satisfy various audience preferences, and a personalized environment that concentrates attention on the exhibits could be fostered. However, the proposed model still shows low prediction accuracy, for the following reasons. First, the data covers diverse visitors to real exhibitions, so it was difficult to control for an optimized experimental environment; the collected data therefore has much noise, which results in lower accuracy. In further research, data collection will be conducted in a more optimized experimental environment, and research to increase the prediction accuracy of the model will be conducted. Second, using changes in facial expression alone is thought to be insufficient to extract audience emotions; if facial expression were combined with other responses, such as sound or audience behavior, it would yield better results.
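The forward pass of the neural network built in steps three to five can be sketched as below. The layer sizes, random weights, and the use of tanh are assumptions for illustration only; the abstract does not specify the study's actual architecture:

```python
import math
import random

def forward(x, W1, b1, W2, b2):
    # Hidden layer with tanh activation.
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    # Linear output layer: (valence, arousal).
    return [sum(w * hi for w, hi in zip(row, h)) + b
            for row, b in zip(W2, b2)]

random.seed(0)
n_in, n_hid = 64, 8  # 64 facial-feature deltas; hidden size is hypothetical
W1 = [[random.uniform(-0.1, 0.1) for _ in range(n_in)] for _ in range(n_hid)]
b1 = [0.0] * n_hid
W2 = [[random.uniform(-0.1, 0.1) for _ in range(n_hid)] for _ in range(2)]
b2 = [0.0, 0.0]

# Stand-in for one visitor's compensated facial-feature changes.
features = [random.uniform(-1, 1) for _ in range(n_in)]
valence, arousal = forward(features, W1, b1, W2, b2)
```

In the study the weights would be learned from the training set rather than drawn at random as here.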

Selective Word Embedding for Sentence Classification by Considering Information Gain and Word Similarity (문장 분류를 위한 정보 이득 및 유사도에 따른 단어 제거와 선택적 단어 임베딩 방안)

  • Lee, Min Seok;Yang, Seok Woo;Lee, Hong Joo
    • Journal of Intelligence and Information Systems, v.25 no.4, pp.105-122, 2019
  • Dimensionality reduction is one of the methods for handling big data in text mining. For dimensionality reduction, we should consider the density of the data, which has a significant influence on the performance of sentence classification. Data of higher dimensions requires many computations, which can cause high computational cost and overfitting in the model. Thus, a dimension reduction process is necessary to improve the performance of the model. Diverse methods have been proposed, from merely lessening the noise of the data, such as misspellings or informal text, to including semantic and syntactic information. On top of that, the expression and selection of text features affect the performance of the classifier for sentence classification, which is one of the fields of natural language processing. The common goal of dimension reduction is to find a latent space that is representative of the raw data from the observation space. Existing methods utilize various algorithms for dimensionality reduction, such as feature extraction and feature selection. In addition to these algorithms, word embeddings, which learn low-dimensional vector space representations of words that capture semantic and syntactic information, are also utilized. For improving performance, recent studies have suggested methods in which the word dictionary is modified according to the positive and negative scores of pre-defined words. The basic idea of this study is that similar words have similar vector representations. Once the feature selection algorithm identifies words that are not important, we assume that words similar to those words also have little impact on sentence classification. This study proposes two ways to achieve more accurate classification: conducting selective word elimination under specific rules and constructing word embeddings based on Word2Vec. To select words of low importance from the text, we use the information gain algorithm to measure importance and cosine similarity to search for similar words. First, we eliminate words that have comparatively low information gain values from the raw text and form word embeddings. Second, we additionally eliminate words that are similar to the words with low information gain values and make word embeddings. Finally, the filtered text and word embeddings are applied to the deep learning models: a Convolutional Neural Network and an Attention-Based Bidirectional LSTM. This study uses customer reviews of Kindle products at Amazon.com, IMDB, and Yelp as datasets, and classifies each dataset using the deep learning models. Reviews that received more than five helpful votes and whose ratio of helpful votes was over 70% were classified as helpful reviews. Yelp only shows the number of helpful votes, so we extracted 100,000 reviews that got more than five helpful votes by random sampling from among 750,000 reviews. Minimal preprocessing, such as removing numbers and special characters from the text data, was applied to each dataset. To evaluate the proposed methods, we compared their performance with that of Word2Vec and GloVe word embeddings that used all the words. We showed that one of the proposed methods performs better than the embeddings with all the words: by removing unimportant words, we can get better performance, although removing too many words lowered the performance. For future research, diverse ways of preprocessing and in-depth analysis of word co-occurrence for measuring similarity values among words should be considered. Also, we only applied the proposed method with Word2Vec; other embedding methods such as GloVe, fastText, and ELMo could be applied with the proposed methods, making it possible to identify useful combinations of word embedding methods and elimination methods.
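The two elimination steps, ranking words by information gain and then pulling in similar words by cosine similarity, might be sketched like this. The toy reviews and the 0.1 gain threshold are invented for illustration, and binary document co-occurrence vectors stand in for the Word2Vec similarities the paper actually uses:

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(word, docs, labels):
    # IG(w) = H(C) - H(C | word present/absent).
    n = len(labels)
    with_w = [l for d, l in zip(docs, labels) if word in d]
    without = [l for d, l in zip(docs, labels) if word not in d]
    cond = (len(with_w) / n) * entropy(with_w) + (len(without) / n) * entropy(without)
    return entropy(labels) - cond

def cosine_sim(w1, w2, docs):
    # Binary co-occurrence vectors: the set of documents containing each word.
    a = {i for i, d in enumerate(docs) if w1 in d}
    b = {i for i, d in enumerate(docs) if w2 in d}
    return len(a & b) / math.sqrt(len(a) * len(b)) if a and b else 0.0

docs = [{"great", "movie", "the"}, {"loved", "the", "plot"},
        {"bad", "the", "acting"}, {"boring", "the", "plot"}]
labels = ["pos", "pos", "neg", "neg"]
vocab = sorted(set().union(*docs))

# Step 1: drop words with low information gain (hypothetical threshold 0.1).
low_ig = {w for w in vocab if info_gain(w, docs, labels) < 0.1}
# Step 2: also drop words highly similar to an already-dropped word.
expanded = low_ig | {w for w in vocab if w not in low_ig
                     and any(cosine_sim(w, lw, docs) > 0.8 for lw in low_ig)}
filtered_docs = [d - expanded for d in docs]
```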

Study on sea fog detection near Korea peninsula by using GMS-5 Satellite Data (GMS-5 위성자료를 이용한 한반도 주변 해무탐지 연구)

  • 윤홍주 (Yoon, Hong-Joo)
    • Journal of the Korea Institute of Information and Communication Engineering, v.4 no.4, pp.875-884, 2000
  • Sea fog/stratus is very difficult to detect because of the characteristics of air-sea interaction, its locality, and the scarcity of observed data from the oceans, such as from ships or ocean buoys. The aim of our study is to develop a new algorithm for sea fog detection using the Geostationary Meteorological Satellite-5 (GMS-5) and to suggest techniques for its continuous detection. In this study, atmospheric synoptic patterns on sea fog days in May 1999 are classified as a cold-air advection type (00UTC, May 10, 1999) and a warm-air advection type (00UTC, May 12, 1999), respectively, and we collected two case days in order to analyze variations of water vapor at the Osan observation station during May 9-10, 1999. To detect daytime sea fog/stratus (00UTC, May 10, 1999), composite imagery, the visible accumulated histogram method, and the surface albedo method were used. When the visible accumulated histogram method was applied, the characteristic values during the day were A(min) ≥ 20% and DA < 10%, and the detected sea fog region was similar between the composite image analysis and the surface albedo method. Inland stations where visibility and relative humidity were below 1 km and 80%, respectively, at 00UTC, May 10, 1999 were Poryoung for the visible accumulated histogram method, and Poryoung, Mokp'o and Kangnung for the surface albedo method. For nighttime sea fog (18UTC, May 10, 1999), the IR accumulated histogram method and the maximum brightness temperature method were used, respectively. The maximum brightness temperature method detected sea fog better than the IR accumulated histogram method, with the characteristic value T_max < T_max_trs, where the threshold is taken as the 700 hPa temperature from GDAPS (Global Data Assimilation and Prediction System). The sea fog region detected by the maximum brightness temperature method was similar to the result of the National Oceanic and Atmospheric Administration/Advanced Very High Resolution Radiometer (NOAA/AVHRR) DCD (Dual Channel Difference), although visibility and relative humidity inland usually did not agree well.
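The nighttime criterion above reduces, per pixel, to comparing the maximum IR brightness temperature over the observation window against the GDAPS 700 hPa temperature; a minimal sketch with hypothetical temperatures in kelvin:

```python
def is_fog_pixel(tb_series_k, t700_k):
    # Apply the abstract's criterion: flag the pixel as fog/stratus if its
    # maximum brightness temperature over the window stays below the
    # threshold taken from the 700 hPa temperature.
    return max(tb_series_k) < t700_k

t700 = 271.0  # hypothetical GDAPS 700 hPa temperature (K)
fog_pixel = [268.5, 269.1, 270.2]    # varies little, stays below threshold
clear_pixel = [278.0, 280.3, 279.5]  # warm sea surface radiates directly
print(is_fog_pixel(fog_pixel, t700), is_fog_pixel(clear_pixel, t700))  # True False
```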


Dose Distribution According to the Tissue Composition Using Wedge Filter by Radiochromic Film (쐐기필터 사용 시 레디오크로믹 필름을 이용한 조직에 따른 선량분포 연구)

  • Kim, Yon-Lae;Lee, Jeong-Woo;Park, Byung-Moon;Jung, Jae-Yong;Park, Ji-Yeon;Suh, Tae-Suk
    • Journal of radiological science and technology, v.35 no.2, pp.157-164, 2012
  • The purpose of this study is to analyze the dose distribution when a wedge filter is used with materials of various tissue electron densities. The dose distribution was assessed with an enhanced dynamic wedge filter and a physical wedge filter in a solid water phantom, a cork phantom, and an air cavity. Film dosimetry is a suitably simple way to measure a 2D dose distribution, so radiochromic films (Gafchromic EBT2, ISP, NJ, USA) were selected to measure and analyze the dose distributions. A linear accelerator delivered 6 MV photons to a 10 × 10 cm² field with 400 MU. The dose distributions on the EBT2 films were analyzed in the in-field area and penumbra regions using a dose analysis program. In the wedge-field dose distributions, the dose from the physical wedge was higher than that from the dynamic wedge for the same electron-density material. The dose distributions of the two wedge types in the solid water phantom and the cork phantom agreed within 2%. However, the dose distribution in the air cavity differed greatly from those in the solid water and cork phantoms, and the wedge field in the air cavity did not show the wedge effect. The penumbra width was observed to be 1 to 2 cm larger at the thick end of the wedge than at the thin end, and the penumbra of the physical wedge filter was on average 6% larger than that of the dynamic wedge filter. When the physical wedge filter is used, the dose increases due to scatter from photons interacting with the physical wedge. Where electron density differs, as among soft tissue, lung, and air, the transmission, absorption, and scattering of high-energy photons change in the medium. Therefore, treatment of regions with differing electron density should use inhomogeneity correction in the treatment planning system.

An Analysis of Terrorism against Korea to Overseas and its Implications - Focusing on the companies advancing to overseas - (한국을 대상으로 한 국제테러리즘의 분석과 시사점 - 해외진출기업을 중심으로 -)

  • Chang, Suk-Heon;Lee, Dae-Sung
    • Korean Security Journal, no.28, pp.153-179, 2011
  • Korea had been a victim of state-sponsored terrorism by North Korea even before international society recognized the terrorism threat in the wake of 9/11 in the US. North Korea's state-sponsored terrorism against South Korea went along with the East/West Cold War system led by the US and the Soviet Union. This is because socialism, which Kim Il-sung, who established a separate government in North Korea with the political, economic, social and military support of the Soviet Union, selected as his political ideology, justifies terrorism as a tool to complete the proletarian revolution. North Korea's state-sponsored terrorism is operated systematically and efficiently by the North Korean military. It poses great concern to international society not only through terrorism against Korea but also by dispatching terrorists and exporting terrorism strategies to third-world countries. In this situation, terrorism against Korea reached a new transition point with 9/11 in the US. As South Korea confronts North Korea, with the war suspended rather than ended, the alliance between the US and Korea is more important than anything else. Because of this, Korea decided to support the anti-terrorism wars of the US and other western countries against Afghanistan and Iraq and to send military forces there. Thus the preface of the anti-terrorism war began. On October 7, 2001, the US and UK started to attack Afghanistan, and the Taliban government in Afghanistan was toppled on December 7, 2001. The US and western countries started a war against Iraq on March 20, 2003; on April 9, 2003 Baghdad, the capital of Iraq, fell, and the government of Saddam Hussein al-Majid al-Awja was expelled. During this process, the terrorism threat against South Korea expanded from North Korea to Arab terrorists and terrorist organizations. Consequently, although the Korean government, scholars and working-level public servants have held discussions and sought countermeasures, the damage continues to grow. Accordingly, terrorism against Korean companies overseas after 9/11 was analyzed, focusing on nation, region, victimology, and the weapons used in the attacks. In particular, the trend of terrorism against Korean companies overseas was discussed by classifying incidents chronologically: the initiation and termination of the anti-terrorism wars against Afghanistan and Iraq, and from the execution of Iraqi President Saddam Hussein al-Majid al-Awja to December 2010. Through this, possible terrorism incidents after the killing of Osama bin Laden, the leader of Al-Qaeda, on May 2, 2011 were projected, and countermeasures were proposed.


A Comparative Study about Industrial Structure Feature between TL Carriers and LTL Carriers (구역화물운송업과 노선화물운송업의 산업구조 특성 비교)

  • 민승기 (Min, Seung-Ki)
    • Journal of Korean Society of Transportation, v.19 no.1, pp.101-114, 2001
  • Transportation enterprises should maintain constant, high-quality operation. Thus, in the short run, transportation enterprises do not change supply in accordance with demand; as a result, they do not reduce operation at will in spite of management deficits. Among freight transportation types, less-than-truckload (LTL) carriage is more closely tied to this feature than truckload (TL) carriage, because the transportation supply of TL is more flexible than that of LTL in responding to freight demand. Relatedly, it appears that the shortage of roads and freight terminals is larger for LTL than for TL. In the comparison of roads and freight terminals in particular, the shortage of freight terminals is larger than that of roads. The shortage of roads was largest in 1990 and improved afterward, but the shortage of freight terminals has lately become serious. So freight terminals need more expansion than roads and show better investment conditions. In LTL, freight terminal expansion brings road expansion; in TL, on the contrary, freight terminal expansion substitutes terminals for roads. In transportation revenue, the freight terminal's contribution to LTL is larger than that to TL. However, when the quasi-fixed factors, roads and freight terminals, are adjusted to their optimal levels in the long run, diseconomies of scale become large in TL, whereas economies of scale become large in LTL. Consequently, it is necessary for TL to adopt measures that support the management of small enterprises and owner-drivers, and LTL should exploit economies of scale by solving problems such as unprofitable routes, excessive rental of freight-handling offices, insufficient freight terminals, driver shortages, and the lack of freight insurance.


Clinical Usefulness of Contrast Echocardiography: The Dose Effect for Left Ventricle Visualization in Dogs (심초음파의 조영제의 임상적 유용성: 개에서 좌심영상화에 대한 조영제 용량의 영향)

  • Shin, Chang-ho;Hwang, Tae-sung;Yoon, Young-min;Jung, Dong-in;Yeon, Seong-chan;Lee, Hee-chun
    • Journal of Veterinary Clinics, v.32 no.6, pp.486-490, 2015
  • Two-dimensional echocardiography is routinely used for the evaluation of cardiac function. Visualization of the endocardial border is essential for the assessment of global and regional left ventricular function in patients with cardiac disease. SonoVue™ is a microbubble contrast agent that consists of sulfur hexafluoride-filled microbubbles in a phospholipid shell. There have been many studies of contrast echocardiographic examination using the SonoVue™ contrast agent, in which various doses were used. To our knowledge, no diagnostically efficient dose of SonoVue™ for evaluating contrast-enhanced left ventricular endocardial border delineation (LVEBD) has been reported in the veterinary literature. The purpose of this study is to compare the visualization time of LVEBD across various doses of SonoVue™ and to find an efficient dose in dogs. Ten healthy Beagles were recruited to the study, and three different doses of SonoVue™ (0.03 ml/kg, 0.05 ml/kg and 0.1 ml/kg) were injected. Endocardial segments were assigned based on previously established methodology, whereby the four-chamber views of the LV were divided into 6 segments. In this study, contrast enhancement of the LVEBD after each injection was evaluated visually at the time point of overall contrast enhancement (segmental scoring 5+) in the LV by three investigators in a blinded manner. Statistical analysis was performed with SPSS version 14.0; all data were analyzed using one-way ANOVA with the multiple-comparison Scheffe test. When the data for the three offsite readers were combined, the mean durations of useful contrast were 3.54 (±2.14), 6.15 (±2.61), and 24.39 (±11.10) seconds for the 0.03 ml/kg, 0.05 ml/kg, and 0.1 ml/kg SonoVue™ doses, respectively. After injection of the contrast agent, no significant side effects such as urticaria, angioedema, hypersensitivity reactions, or digestive system disorders were observed. This study suggests that an efficient dose of the SonoVue™ contrast agent for improving left ventricle visualization is 0.1 ml/kg; the duration of useful enhancement of LVEBD and the reproducibility were also highest at the 0.1 ml/kg dose.
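The one-way ANOVA across the three dose groups boils down to an F statistic comparing between-group to within-group variance; a sketch with invented per-dog durations, since the abstract reports only group means:

```python
def one_way_anova_f(groups):
    # F = (between-group SS / (k-1)) / (within-group SS / (n-k)).
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand = sum(all_vals) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical useful-contrast durations (s) per dog for each dose group.
d003 = [2.1, 3.9, 4.6, 3.5]
d005 = [5.8, 6.9, 5.2, 6.7]
d010 = [20.1, 31.4, 18.9, 27.2]
f = one_way_anova_f([d003, d005, d010])
```

A large F here would motivate the post-hoc Scheffe comparisons used in the study to locate which dose groups differ.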