The Variation of Free Amino Acid during the Tomato Processing (토마토 가공(加工) 공정(工程) 중(中)에 있어서의 유리(遊離) 아미노 산(酸)의 변동(變動))

  • Kim, Seung Yeol;Kato, Hiromichi;Okitani, Akihiro;Hayase, Fumitaka
    • Korean Journal of Agricultural Science / v.9 no.2 / pp.576-583 / 1982
  • The variation of free amino acids during tomato processing was studied using the tomato variety Kagome 77. The concentrations of free amino acids in fresh and heated pulp, and in puree and paste, were analyzed with an automatic amino acid analyzer (Hitachi model KLA-5). 1. A significant difference in decomposition rate between the two amides, glutamine and asparagine, was recognized: glutamine decomposed quickly and none remained in the paste, whereas 56% of the asparagine remained. 2. Among the acidic amino acids, glutamic acid showed the largest decrease of all free amino acids, followed by aspartic acid; 38% of the glutamic acid and 24% of the aspartic acid remained. 3. Among the neutral amino acids, glycine, alanine, valine, isoleucine, and leucine tended to decrease slightly during the heating and concentrating processes. 4. No apparent variation was found for the basic amino acids lysine and histidine, while arginine increased slightly. 5. The aromatic amino acids tyrosine, phenylalanine, and tryptophan appeared to increase slightly during heating, but no variation during concentration was recognized. 6. Methionine, the sulfur-containing amino acid, decreased slightly throughout the process, whereas no apparent decrease of the non-protein amino acid γ-aminobutyric acid was recognized. 7. The free amino acid contents of fresh pulp ranked glutamic acid > γ-aminobutyric acid > glutamine > aspartic acid > asparagine; in the paste the order was glutamic acid > γ-aminobutyric acid > aspartic acid > asparagine. The percentage distribution of aromatic and basic amino acids increased, although not greatly. 8. When the amino acids were analyzed with the Hitachi KLA-5, an unknown peak that never appeared in the fresh pulp was observed just before tryptophan in the processed samples; the peak grew with heating and concentrating. It was later established that the peak was due to neither lysinoalanine nor ornithine.

  • PDF

Analyzing Heart Rate Variability for Automatic Sleep Stage Classification (수면단계 자동분류를 위한 심박동변이도 분석)

  • 김원식;김교헌;박세진;신재우;윤영로
    • Science of Emotion and Sensibility / v.6 no.4 / pp.9-14 / 2003
  • Sleep stages have been a useful indicator of how comfortably a person sleeps, but the traditional method of scoring sleep stages with polysomnography, based on the integrated analysis of the electroencephalogram (EEG), electrooculogram (EOG), electrocardiogram (ECG), and electromyogram (EMG), is too restrictive for participants to sleep comfortably. While the sympathetic nervous system is predominant during wakefulness, the parasympathetic nervous system is more active during sleep, and cardiovascular function is controlled by this autonomic nervous system. We therefore interpreted heart rate variability (HRV) across sleep stages to find a simple method of classifying them. Six healthy male college students participated, and 12 nights of sleep were recorded. Sleep stages based on the standard scoring system for sleep stages were classified automatically with a polysomnograph by measuring EEG, EOG, ECG, and EMG (chin and leg) for the six participants during sleep. To extract only the ECG signals from the polysomnograph and interpret the HRV, a Sleep Data Acquisition/Analysis System was devised for this research. The power spectrum of HRV was divided into three ranges: low frequency (LF), medium frequency (MF), and high frequency (HF). The LF/HF ratio of Stage W (wakefulness) was 325% higher than that of Stage 2 (p<.05), 628% higher than that of Stage 3 (p<.001), and 800% higher than that of Stage 4 (p<.001). Moreover, this ratio in Stage 4 was 427% lower than in Stage REM (rapid eye movement) (p<.05) and 418% lower than in Stage 1 (p<.05). The LF/HF ratio decreased monotonically as the sleep stage changed from Stage W through Stage REM, Stage 1, Stage 2, and Stage 3 to Stage 4. While the difference in the MF/(LF+HF) ratio among sleep stages was not significant, descriptive statistics for the sample group showed it to be higher in Stage REM and Stage 3 than in the other stages. A simplified sketch of the band-power computation behind such ratios is given after this entry.

  • PDF
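The LF/HF and MF/(LF+HF) ratios in the abstract above are band integrals of the HRV power spectrum. Below is a minimal sketch of that computation from R-R intervals; the band limits, the 4 Hz resampling rate, and the Welch settings are illustrative assumptions, since the abstract does not state the authors' exact values.

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.integrate import trapezoid
from scipy.signal import welch

def hrv_ratios(rr_ms, fs=4.0, lf=(0.04, 0.08), mf=(0.08, 0.15), hf=(0.15, 0.40)):
    """LF/HF and MF/(LF+HF) ratios from a series of R-R intervals (ms)."""
    t = np.cumsum(rr_ms) / 1000.0                    # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)          # uniform resampling grid
    rr_even = interp1d(t, rr_ms, kind="cubic")(grid) # evenly sampled tachogram
    f, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)

    def band_power(lo, hi):                          # integrate PSD over a band
        m = (f >= lo) & (f < hi)
        return trapezoid(psd[m], f[m])

    p_lf, p_mf, p_hf = band_power(*lf), band_power(*mf), band_power(*hf)
    return p_lf / p_hf, p_mf / (p_lf + p_hf)

# Example on synthetic R-R intervals around 800 ms (~75 bpm).
rr = 800 + 50 * np.random.randn(600)
lf_hf, mf_ratio = hrv_ratios(rr)
print(f"LF/HF = {lf_hf:.2f}, MF/(LF+HF) = {mf_ratio:.2f}")
```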

Comparative Analysis of GNSS Precipitable Water Vapor and Meteorological Factors (GNSS 가강수량과 기상인자의 상호 연관성 분석)

  • Kim, Jae Sup;Bae, Tae-Suk
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.33 no.4 / pp.317-324 / 2015
  • GNSS was first proposed for application in weather forecasting in the mid-1980s; it has since demonstrated practical uses in GNSS meteorology, and related research continues. Precipitable Water Vapor (PWV), calculated from the GNSS signal delays caused by the Earth's troposphere, represents the amount of water vapor in the atmosphere and is therefore widely used in the analysis of various weather phenomena, such as monitoring weather conditions and detecting climate change. In this study, we calculated PWV from the meteorological information of an Automatic Weather Station (AWS) together with the GNSS data processing of a Continuously Operating Reference Station (CORS) in order to analyze the heavy snowfall in the Ulsan area in early 2014. Song's model was adopted for the weighted mean temperature (Tm), the most important parameter in the calculation of PWV. The study period is a total of 56 days (February 2013 and February 2014). The average PWV for February 2014 was 11.29 mm, which is 11.34% lower than that of the heavy snowfall period, and the average PWV for February 2013, when no heavy snowfall occurred, was 10.34 mm, which is 8.41% lower than that of the snowfall period. In addition, meteorological factors obtained from the AWS were compared as well, yielding a very low correlation of 0.29 with the saturated vapor pressure calculated using the empirical formula of Magnus. The behavior of PWV tends to change with the precipitation type, namely snow or rain. The PWV showed a sudden increase and a subsequent rapid drop about 6.5 hours before precipitation. We conclude that pattern analysis of GNSS PWV is an effective method for analyzing precursor phenomena of precipitation.
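The abstract does not spell out the conversion from GNSS zenith wet delay (ZWD) to PWV, so the sketch below uses the widely cited Bevis et al. (1992) formulation as a stand-in. The refractivity constants and the surface-temperature Tm regression shown are those common defaults, not Song's Tm model that the authors actually adopt.

```python
# PWV from GNSS zenith wet delay (ZWD), after Bevis et al. (1992).
# Tm below uses the common Bevis surface-temperature regression as a
# placeholder; the paper itself uses Song's Tm model for Korea.

RHO_W = 1000.0      # density of liquid water, kg/m^3
R_V   = 461.5       # specific gas constant of water vapor, J/(kg K)
K2P   = 22.1        # k2' refractivity constant, K/hPa
K3    = 3.739e5     # k3  refractivity constant, K^2/hPa

def weighted_mean_temp(ts_kelvin: float) -> float:
    """Approximate weighted mean temperature Tm from surface temperature Ts."""
    return 70.2 + 0.72 * ts_kelvin

def pwv_mm(zwd_m: float, tm_kelvin: float) -> float:
    """Convert zenith wet delay (meters) to precipitable water vapor (mm)."""
    # Dimensionless factor Pi ~ 0.15; the 1e8 folds the 1e6 refractivity
    # scale together with the hPa -> Pa unit conversion of the constants.
    pi = 1.0e8 / (RHO_W * R_V * (K3 / tm_kelvin + K2P))
    return pi * zwd_m * 1000.0   # meters of water -> millimeters

tm = weighted_mean_temp(283.15)          # e.g., 10 degC surface temperature
print(round(pwv_mm(0.070, tm), 2))       # ~11 mm for a 7 cm wet delay
```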

Automated Analyses of Ground-Penetrating Radar Images to Determine Spatial Distribution of Buried Cultural Heritage (매장 문화재 공간 분포 결정을 위한 지하투과레이더 영상 분석 자동화 기법 탐색)

  • Kwon, Moonhee;Kim, Seung-Sep
    • Economic and Environmental Geology / v.55 no.5 / pp.551-561 / 2022
  • Geophysical exploration methods are very useful for generating high-resolution images of underground structures and can be applied to the investigation of buried cultural properties and the determination of their exact locations. In this study, image feature extraction and image segmentation methods were applied to automatically distinguish the structures of buried relics in high-resolution ground-penetrating radar (GPR) images obtained at the center of the Silla Kingdom, Gyeongju, South Korea. The main goal of the feature extraction analyses is to identify the circular features from building remains and the linear features from ancient roads and fences. Feature extraction was implemented with the Canny edge detection and Hough transform algorithms: we applied the Hough transform to the edge image produced by the Canny algorithm in order to determine the locations of the target features. However, the Hough transform requires different parameter settings for each survey sector. For image segmentation, we applied a connected-component labeling algorithm and object-based image analysis using the Orfeo Toolbox (OTB) in QGIS. The connected-component labeled image shows that the signals associated with the target buried relics are effectively connected and labeled, although multiple labels are often assigned to a single structure in the given GPR data. Object-based image analysis was conducted using Large-Scale Mean-Shift (LSMS) image segmentation: a vector layer containing the pixel values of each segmented polygon was estimated first and then used to build a training-validation dataset by assigning the polygons to one class associated with the buried relics and another class for the background field. With a Random Forest classifier, we find that the polygons of the LSMS segmentation layer can be successfully classified into those of the buried relics and those of the background. We therefore propose that the automatic classification methods applied here to GPR images of buried cultural heritage can help obtain consistent analysis results for planning excavations.
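As a rough illustration of the feature-extraction half of this pipeline, the sketch below runs Canny edge detection and then line and circle Hough transforms with OpenCV. The input path and all thresholds and radii are placeholder values; as the abstract notes, each survey sector needed its own parameter settings.

```python
import cv2
import numpy as np

# Load a GPR depth-slice exported as a grayscale image (path is illustrative).
img = cv2.imread("gpr_slice.png", cv2.IMREAD_GRAYSCALE)
img = cv2.GaussianBlur(img, (5, 5), 0)           # suppress speckle before edges

edges = cv2.Canny(img, threshold1=50, threshold2=150)

# Linear features (roads, fences): probabilistic Hough on the edge map.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                        minLineLength=40, maxLineGap=10)

# Circular features (building remains): gradient Hough on the blurred image.
circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1.2, minDist=30,
                           param1=150, param2=40, minRadius=10, maxRadius=80)

out = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)      # overlay detections for review
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(out, (x1, y1), (x2, y2), (0, 0, 255), 2)
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        cv2.circle(out, (x, y), r, (0, 255, 0), 2)
cv2.imwrite("gpr_features.png", out)
```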

The evaluation for the usability of the Varian Standard Couch modeling using Treatment Planning System (치료계획 시스템을 이용한 Varian Standard Couch 모델링의 유용성 평가)

  • Yang, Yong Mo;Song, Yong Min;Kim, Jin Man;Choi, Ji Min;Choi, Byeung Gi
    • The Journal of Korean Society for Radiation Therapy / v.28 no.1 / pp.77-86 / 2016
  • Purpose: During radiation treatment, the beam is attenuated by the carbon fiber couch. In this study, we evaluated the usability of the Varian Standard Couch (VSC) by modeling it with a Treatment Planning System (TPS). Materials and Methods: The VSC was scanned by the CBCT (Cone Beam Computed Tomography) of the linac (Clinac iX, VARIAN, USA) in its three configurations: Side Rail Out Grid (SROG), Side Rail In Grid (SRIG), and Side Rail In Out Spine Down Bar (SRIOS). After the scan, the data were transferred to the TPS, and the Side Rail, Side Bar Upper, Side Bar Lower, and Spine Down Bar were modeled by automatic contouring. We scanned a Cheese Phantom (Middleton, USA) with Computed Tomography (LightSpeed RT 16, GE, USA), transferred the data to the TPS, and applied the previously modeled VSC to it. Dose was measured at the isocenter with an ion chamber (A1SL, Standard Imaging, USA) in the Cheese Phantom using 4 and 10 MV beams at every $5^{\circ}$ of gantry angle for two field sizes ($3\times3\,cm^2$ and $10\times10\,cm^2$) with unchanged MU (=100), and the calculated dose was then compared with the measured dose. We also included the dose at $127^{\circ}$ in SRIG to assess the attenuation by the Side Bar Upper. Results: The density of the VSC determined by CBCT in the TPS was $0.9\,g/cm^3$, and that of the Spine Down Bar was $0.7\,g/cm^3$. The radiation was attenuated by 17.49%, 16.49%, 8.54%, and 7.59% at the Side Rail, Side Bar Upper, Side Bar Lower, and Spine Down Bar, respectively. To check the accuracy of the modeling, calculated and measured doses were compared: the average error was 1.13%, and the maximum error was 1.98% at the $170^{\circ}$ beam crossing the Spine Down Bar. Conclusion: In evaluating the usability of the VSC modeled with the TPS, the maximum error between the calculated and measured doses was 1.98%. VSC modeling helped predict the dose, so we expect it to be helpful for more accurate treatment. A small sketch of this percent-error summary is given after this entry.

  • PDF
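A trivial way to reproduce the error summary in this abstract's Results: given calculated and measured doses per gantry angle, compute percent deviations and report the mean and worst case. The numbers below are made-up placeholders, not the paper's measurements.

```python
import numpy as np

# Hypothetical calculated vs. measured doses (cGy) at 5-degree gantry steps.
angles    = np.arange(0, 360, 5)
dose_calc = 100 + np.random.normal(0, 0.8, angles.size)
dose_meas = 100 + np.random.normal(0, 0.8, angles.size)

err_pct = 100.0 * (dose_calc - dose_meas) / dose_meas
print(f"average |error| = {np.mean(np.abs(err_pct)):.2f}%")
i = np.argmax(np.abs(err_pct))
print(f"maximum |error| = {abs(err_pct[i]):.2f}% at gantry {angles[i]} deg")
```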

A Study on the Peripheral Dose of 6MV X-ray Beam (6 MV X선의 주변선량분포)

  • Choi, Doo-Ho;Kim, Il-Han;Ha, Sung-Whan;Park, Charn-Il
    • Journal of Radiation Protection and Research / v.14 no.1 / pp.24-33 / 1989
  • The peripheral dose, defined as the dose outside the therapeutic photon field, was estimated for a 6 MV X-ray linear accelerator. The measurements were performed using silicon diode detectors driven by an automatically controlled water phantom. The effects of field size, collimator position, presence or absence of a wedge filter, and wedge angle were analyzed. The results were as follows. 1. The peripheral dose decreases as the distance from the field margin increases, but remains more than 2.4% of the central-axis maximum dose even at 15 cm from the field margin. 2. The maximum build-up of peripheral dose occurs at 2-3 mm from the water surface; the dose drops to a minimum at 1.5 cm depth and then increases again. 3. The peripheral dose increases with field size; at short distances from the field margin, the peripheral dose for a $20\times20\,cm^2$ field is more than twice that for a $5\times5\,cm^2$ field. 4. The peripheral dose is higher along the upper collimator than along the lower collimator, with a difference of less than 1%. 5. The presence of a wedge filter increases the peripheral dose, which is higher along the blade side of the wedge than along the ridge side; the difference is about 3% at 5 cm from the field margin for a $15\times15\,cm^2$ field and a $60^{\circ}$ wedge filter. 6. The peripheral dose increases with wedge angle, being about twice as high with a $60^{\circ}$ wedge filter as with an open field.

  • PDF

A Study on Change of Suspended Solids by Forest Road Construction(I) -Parallel Watersheds Method- (임도개설(林道開設)에 따른 부유토사량(浮遊土砂量) 변화(變化)(I) -대조유역법(對照流域法)을 중심(中心)으로-)

  • Kim, Kyoung-Jin;Chun, Kun-Woo
    • Journal of Forest and Environmental Science / v.10 no.1 / pp.57-65 / 1994
  • This study was carried out to clarify sediment export by measuring the suspended solids in streamflow during the rainy season. The study area is located in the Experimental Forests of Kangwon National University, where a forest road was under construction. The forested watershed with forest road construction was compared with an undisturbed forested watershed in terms of rainfall and discharge, suspended solids and discharge, and rainfall and suspended solids. The results were as follows. 1. The relationship between discharge and rainfall is shown in Table 3 and Fig. 3. The delay of the peak observed in the hydrograph varied with rainfall intensity and antecedent rainfall. When it rained on 12 June (more than 20 mm/hour for several hours), the peak began three hours after the rainfall intensity exceeded 20 mm/hour and reached $1514\,m^3/hour$ on the automatic water-level recorder. On 8 August (maximum rainfall intensity 40 mm/hour), the peak discharge of $1246\,m^3/hour$ coincided with the maximum rainfall intensity, and on 20 August (maximum rainfall intensity 17.2 mm/hour) the peak discharge of $1245\,m^3/hour$ came two hours after the maximum rainfall intensity. 2. In the watershed under forest road construction, suspended solids increased in proportion to rising discharge: on 12 June the maximum hourly discharge was $1514\,m^3/hour$, and 1261 mg/l of suspended solids was observed an hour after the discharge peak; on 8 and 20 August the peaks were $1246\,m^3/hour$ and $1245\,m^3/hour$ at the respective measuring times. The maximum suspended-solids values measured in the two watersheds at the same time were 4952 mg/l and 472 mg/l. 3. During the rainy season, the concentration of suspended solids was influenced by rainfall intensity and, under strong rainfall intensity, increased in a curvilinear-regression pattern. In the two watersheds, the maximum suspended-solids values were 1261 mg/l and 125 mg/l, 4952 mg/l and 44 mg/l, and 472 mg/l and 4 mg/l for rains (a), (b), and (c), respectively; the two watersheds showed a remarkable difference.

  • PDF

Consideration of density matching technique of the plate type direct radiologic image system and the conventional X-ray film; first step for the subtraction (Ektaspeed plus 필름을 이용한 일반 방사선시스템과 Digora를 이용한 디지탈 영상시스템의 밀도변화 비교연구)

  • So, Sung-Soo;Noh, Hyeun-Soo;Kim, Chang-Sung;Choi, Seong-Ho;Kim, Kee-Deog;Cho, Kyoo-Sung
    • Journal of Periodontal and Implant Science / v.32 no.1 / pp.199-211 / 2002
  • Digital subtraction technique and computer-assisted densitometric analysis detect minor changes in bone density and thus increase diagnostic accuracy. This advantage, along with high sensitivity and an objectivity that precludes human bias, has drawn interest in radiologic research. The objectives of this study were to verify whether radiographic density behaves linearly when the density profile of a standard periapical radiograph, with an aluminium step wedge as the reference, is examined under the various circumstances encountered in clinical situations, and to establish the relationship between the existing standard radiographic system and future digital image systems by confirming the correlation between standard radiography and the Digora system, a digital image system currently in use. For quantitative analysis of bone tissue, a digital image system using a high-resolution automatic slide scanner as the input device and the Digora system were compared and analyzed with the multifunctional program Brain3dsp. The following conclusions were obtained. 1. Under common clinical conditions (70 kVp, 0.2 s, focal distance 10 cm), the Al-equivalent image equation was $Y=11.21X+46.62$ ($r^2=0.9898$) for the standard radiographic system and $Y=12.68X+74.59$ ($r^2=0.9528$) for the Digora system; a linear relation was confirmed in both. 2. In the standard radiographic system, when all conditions were kept the same except the condition of the developing solution, the Al-equivalent image equation was $Y=10.07X+41.64$ ($r^2=0.9861$), again a high correlation. 3. When all conditions were kept the same except the kilovoltage peak, linearity was maintained at 60 kVp, with $Y=14.60X+68.86$ ($r^2=0.9886$) for the standard radiographic system and $Y=13.90X+80.68$ ($r^2=0.9238$) for the Digora system. 4. When only the exposure time was varied, from 0.01 s to 0.8 s, the Al-equivalent image equation remained linear in both systems; the R-squared values ranged from 0.9188 to 0.9900, and the standard radiographic system generally showed higher values than the Digora system. 5. When only the focal distance was varied, from 5 cm to 30 cm, the Al-equivalent image equation remained linear in both systems; the R-squared values ranged from 0.9463 to 0.9925, and the standard radiographic system tended to show higher values at shorter focal distances.
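The Al-equivalent image equations above are simple least-squares lines relating step-wedge thickness to measured gray level. The sketch below shows how such a calibration line and its r² can be fitted; the step thicknesses and gray values are invented for illustration, not the paper's data.

```python
import numpy as np

# Hypothetical aluminium step-wedge calibration data.
steps_mm = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)    # thickness X
gray     = np.array([58, 70, 80, 92, 103, 114, 125, 137.0])   # gray level Y

slope, intercept = np.polyfit(steps_mm, gray, deg=1)          # least-squares fit
pred = slope * steps_mm + intercept
r2 = 1 - np.sum((gray - pred) ** 2) / np.sum((gray - gray.mean()) ** 2)

print(f"Y = {slope:.2f}X + {intercept:.2f}, r^2 = {r2:.4f}")
```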

Selection Model of System Trading Strategies using SVM (SVM을 이용한 시스템트레이딩전략의 선택모형)

  • Park, Sungcheol;Kim, Sun Woong;Choi, Heung Sik
    • Journal of Intelligence and Information Systems / v.20 no.2 / pp.59-71 / 2014
  • System trading has recently become popular among Korean traders. System traders use automatic order systems driven by system-generated buy and sell signals, which come from predetermined entry and exit rules coded by the traders. Most research on system trading has focused on designing profitable entry and exit rules using technical indicators; however, market conditions, strategy characteristics, and money management also influence profitability. Unexpected price deviations from the predetermined trading rules can incur large losses, so most professional traders use strategy portfolios rather than a single strategy. Building a good strategy portfolio is important because trading performance depends on it, yet despite this importance, rule-of-thumb methods have been used to select trading strategies. In this study, we propose an SVM-based strategy portfolio management system. The SVM, introduced by Vapnik, is known to be effective in the data mining area and can build good portfolios within a very short time. Since the SVM minimizes structural risk, it is well suited to the futures market, in which prices do not move exactly as they did in the past. Our system trading strategies include a moving-average cross system, MACD cross system, trend-following system, buy-dips-and-sell-rallies system, DMI system, Keltner channel system, Bollinger Bands system, and Fibonacci system. These strategies are well known and frequently used by professional traders. We programmed them to generate automated entry and exit signals, and we propose an SVM-based strategy selection system together with a portfolio construction and order routing system. The strategy selection system is a portfolio training system: it generates training data and builds the SVM model from the optimal portfolio. We form an $m\times n$ data matrix by dividing KOSPI 200 index futures data into equal periods, derive the optimal strategy portfolio from each strategy's performance, and generate the SVM model from these data. We use 80% of the data for training and the remaining 20% for testing. For training, we select the two strategies that show the highest profit on the next day: selection method 1 always selects two strategies, while method 2 selects at most two strategies showing profit above 0.1 point. We use the one-against-all method, which has a fast processing time. We analyze daily KOSPI 200 index futures data from January 1990 to November 2011, using price change rates over 50 days as the SVM input; the training period runs from January 1990 to March 2007 and the test period from March 2007 to November 2011. We suggest three benchmark portfolios: BM1 holds two KOSPI 200 index futures contracts for the test period; BM2 consists of the two strategies with the largest cumulative profit during the 30 days before testing starts; BM3 consists of the two strategies with the best profits during the test period. Trading costs include brokerage commission and slippage. The proposed strategy portfolio management system earns more than double the profit of the benchmark portfolios: after deducting trading costs, BM1 shows a profit of 103.44 points, BM2 488.61 points, and BM3 502.41 points, with BM3, the portfolio of the two most profitable strategies over the test period, being the strongest benchmark. The proposed system 1 shows a profit of 706.22 points and system 2 a profit of 768.95 points after trading costs, and the equity curves over the entire period show a stable pattern. With the higher profit, this suggests a good trading direction for system traders; adding a money-management module to the system could make the portfolios still more stable and profitable.
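To make the selection mechanism concrete, here is a toy version of the idea under stated assumptions: features are the last 50 daily price-change rates, labels are the index of the best-performing strategy on the next day, and scikit-learn's one-vs-rest SVC stands in for the paper's one-against-all SVM. The data, strategy profits, and all hyperparameters are synthetic placeholders, not the authors' setup.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.multiclass import OneVsRestClassifier

rng = np.random.default_rng(0)
n_days, n_strategies, window = 1000, 8, 50

# Synthetic daily price-change rates and per-strategy daily profits.
returns = rng.normal(0, 1, n_days)
strategy_profit = rng.normal(0, 1, (n_days, n_strategies))

# Feature: the last 50 daily change rates; label: best strategy tomorrow.
X = np.array([returns[t - window:t] for t in range(window, n_days - 1)])
y = strategy_profit[window + 1:n_days].argmax(axis=1)

split = int(0.8 * len(X))                    # 80% train / 20% test
model = OneVsRestClassifier(SVC(kernel="rbf", C=1.0))
model.fit(X[:split], y[:split])

picks = model.predict(X[split:])             # strategy chosen for each test day
test_days = np.arange(window + 1 + split, n_days)
pnl = strategy_profit[test_days, picks].sum()
print(f"toy out-of-sample P&L: {pnl:.2f} points")
```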

A Study on the Effect of the Document Summarization Technique on the Fake News Detection Model (문서 요약 기법이 가짜 뉴스 탐지 모형에 미치는 영향에 관한 연구)

  • Shim, Jae-Seung;Won, Ha-Ram;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems / v.25 no.3 / pp.201-220 / 2019
  • Fake news has emerged as a significant issue over the last few years, igniting discussion and research on how to solve the problem. In particular, studies on automated fact-checking and fake news detection using artificial intelligence and text analysis techniques have drawn attention. Fake news detection is a form of document classification, so document classification techniques have been widely used in this research area, whereas document summarization techniques have been inconspicuous. At the same time, automatic news summarization services have become popular, and a recent study found that using news summarized through abstractive summarization strengthened the predictive performance of fake news detection models. The integration of document summarization technology therefore needs to be studied in the domestic (Korean) news data environment. To examine the effect of extractive summarization on the fake news detection model, we first summarized news articles through extractive summarization, then built a summary-based detection model, and finally compared it with a full-text-based detection model. The study found that BPN (Back Propagation Neural Network) and SVM (Support Vector Machine) did not exhibit a large performance difference, while for DT (Decision Tree) the full-text-based model performed somewhat better; for LR (Logistic Regression), our summary-based model performed best, although the difference from the full-text-based model was not statistically significant. Thus, when summarization is applied, at least the core information of the fake news is preserved, and the LR-based model shows the possibility of performance improvement. This study is an experimental application of extractive summarization to fake news detection research employing various machine-learning algorithms. Its limitations are the relatively small amount of data and the lack of comparison among various summarization technologies; an in-depth analysis applying various analytical techniques to a larger data volume would be helpful in the future.
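A minimal sketch of the summarize-then-classify pipeline, assuming a generic TF-IDF sentence scorer as the extractive summarizer and logistic regression as the detector; the paper's actual summarizer, features, and Korean-language preprocessing are not reproduced here, and the two example articles are invented.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def extractive_summary(text: str, k: int = 2) -> str:
    """Keep the k sentences with the highest mean TF-IDF weight (naive scorer)."""
    sents = [s.strip() for s in text.split(".") if s.strip()]
    if len(sents) <= k:
        return text
    tfidf = TfidfVectorizer().fit_transform(sents)
    scores = np.asarray(tfidf.mean(axis=1)).ravel()
    top = sorted(np.argsort(scores)[-k:])          # keep original sentence order
    return ". ".join(sents[i] for i in top) + "."

# Invented toy corpus: 1 = fake, 0 = real.
articles = [
    "Miracle pill cures all disease overnight. Doctors hate this trick. Buy now.",
    "The city council approved the budget on Tuesday. The vote was 7 to 2.",
]
labels = [1, 0]

summaries = [extractive_summary(a) for a in articles]
vec = TfidfVectorizer()
X = vec.fit_transform(summaries)                    # summary-based features
clf = LogisticRegression().fit(X, labels)           # detection model
print(clf.predict(vec.transform([extractive_summary(
    "Shocking secret the government hides. Click to learn the one trick.")])))
```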