• Title/Summary/Keyword: SIMPLEST

Search Results: 438

Management of Integrated Summary Data Based on the Wavelet Transform for Internet Query Processing (인터넷 질의 처리를 위한 웨이블릿 변환에 기반한 통합 요약정보의 관리)

  • Joe, Moon-Jeung;Whang, Kyu-Young;Kim, Sang-Wook;Shim, Kyu-Seok
    • Journal of KIISE:Databases
    • /
    • v.28 no.4
    • /
    • pp.702-714
    • /
    • 2001
  • As Internet technology evolves, there is a growing need for Internet queries involving multiple information sources. Efficient processing of such queries necessitates integrated summary data that compactly represents the data distribution of the entire database scattered over many information sources. This paper presents an efficient method of managing integrated summary data based on the wavelet transform and addresses Internet query processing using the integrated summary data. The simplest method for creating the integrated summary data would be to summarize the integrated data distribution obtained by merging the data distributions of multiple information sources. However, this method suffers from the high cost of transmitting, storing, and merging a large amount of data distribution. To overcome these drawbacks, we propose a new wavelet transform based method that creates the integrated summary data by merging multiple summary data, along with an effective method for optimizing Internet queries using it. A wavelet-transformed summary data is converted to satisfy the conditions for merging. Moreover, the merging process is very simple owing to the properties of the wavelet transform. We formally derive the upper bound of the error of the wavelet-transformed integrated summary data. Compared with histogram-based integrated summary data, the wavelet-transformed integrated summary data proves to be 1.6~5.5 times more accurate when used for selectivity estimation in experiments. In processing Internet top-N queries involving 56 information sources, using the integrated summary data reduces the processing cost to 1/44 of the cost of not using it.

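The merging step in the abstract above rests on the linearity of the wavelet transform: the transform of a sum of histograms is the sum of the transforms. A minimal sketch of that idea only (not the paper's exact summary construction), assuming plain Haar averaging/differencing and a keep-the-largest-k coefficient summary:

```python
import numpy as np

def haar(x):
    """Plain (average/difference) Haar decomposition of a length-2^n histogram."""
    x = np.asarray(x, dtype=float)
    coeffs = []
    while x.size > 1:
        coeffs.append((x[0::2] - x[1::2]) / 2.0)  # detail coefficients at this level
        x = (x[0::2] + x[1::2]) / 2.0             # running pairwise averages
    coeffs.append(x)                              # overall average
    return np.concatenate(coeffs[::-1])           # [average, coarse -> fine details]

def summarize(hist, k):
    """Summary = the k largest-magnitude wavelet coefficients, zeros elsewhere."""
    c = haar(hist)
    s = np.zeros_like(c)
    keep = np.argsort(np.abs(c))[-k:]
    s[keep] = c[keep]
    return s

# Because the transform is linear, transforming the merged (summed) histograms
# equals summing the per-site transforms, so each site can ship only a small
# coefficient summary and the integrator merges them coefficient-wise.
h1 = np.array([4.0, 2.0, 6.0, 8.0])   # site 1 histogram (toy data)
h2 = np.array([1.0, 3.0, 5.0, 7.0])   # site 2 histogram (toy data)
merged_summary = summarize(h1, 2) + summarize(h2, 2)
```

Merging coefficient vectors this way avoids transmitting and re-summarizing full data distributions, which is the cost the paper sets out to eliminate.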

Extensions of X-means with Efficient Learning the Number of Clusters (X-means 확장을 통한 효율적인 집단 개수의 결정)

  • Heo, Gyeong-Yong;Woo, Young-Woon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.12 no.4
    • /
    • pp.772-780
    • /
    • 2008
  • K-means is one of the simplest unsupervised learning algorithms that solve the clustering problem. However, K-means suffers from a basic shortcoming: the number of clusters k has to be known in advance. In this paper, we propose extensions of X-means, which can estimate the number of clusters using the Bayesian information criterion (BIC). We introduce two different versions of the algorithm: modified X-means (MX-means) and generalized X-means (GX-means), which employ one full covariance matrix per cluster and so can estimate the number of clusters efficiently without the severe over-fitting that X-means suffers due to its spherical cluster assumption. The algorithms start with one cluster and iteratively try to split a cluster to maximize the BIC score. The former uses the K-means algorithm to find a set of optimal clusters for the current k, which makes it simple and fast; however, it generates wrongly estimated centers when the clusters overlap. The latter uses the EM algorithm to estimate the parameters and generates more stable clusters even when the clusters overlap. Experiments with synthetic data show that the proposed methods provide a robust estimate of the number of clusters and cluster parameters compared to other existing top-down algorithms.
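The split-or-not decision in X-means-style algorithms compares BIC scores with and without the extra cluster. A rough illustration of that comparison only (not the paper's MX-/GX-means): one common form of the X-means BIC under the spherical-Gaussian model, paired with a deliberately simplified Lloyd's loop using deterministic initialization:

```python
import numpy as np

def kmeans(X, k, iters=50):
    # Simplified Lloyd's algorithm with deterministic (evenly spaced) init.
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

def bic(X, centers, labels):
    # One common form of the X-means BIC under a spherical-Gaussian model.
    n, d = X.shape
    k = len(centers)
    var = ((X - centers[labels]) ** 2).sum() / max(n - k, 1)  # pooled ML variance
    loglik = 0.0
    for j in range(k):
        nj = (labels == j).sum()
        if nj == 0:
            continue
        loglik += (nj * np.log(nj / n)
                   - nj * d / 2.0 * np.log(2 * np.pi * var)
                   - (nj - 1) * d / 2.0)
    p = k * (d + 1)  # free parameters: k centers plus the shared variance terms
    return loglik - p / 2.0 * np.log(n)

# Two well-separated blobs: splitting into k=2 should raise the BIC over k=1.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (100, 2)), rng.normal(5.0, 0.3, (100, 2))])
scores = {k: bic(X, *kmeans(X, k)) for k in (1, 2)}
```

The paper's contribution is replacing the spherical variance here with a full covariance matrix per cluster, which removes the over-splitting the spherical assumption causes on elongated or overlapping clusters.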

Application of Multispectral Remotely Sensed Imagery for the Characterization of Complex Coastal Wetland Ecosystems of southern India: A Special Emphasis on Comparing Soft and Hard Classification Methods

  • Shanmugam, Palanisamy;Ahn, Yu-Hwan;Sanjeevi, Shanmugam
    • Korean Journal of Remote Sensing
    • /
    • v.21 no.3
    • /
    • pp.189-211
    • /
    • 2005
  • This paper makes an effort to compare the recently evolved soft classification method based on Linear Spectral Mixture Modeling (LSMM) with the traditional hard classification methods based on the Iterative Self-Organizing Data Analysis (ISODATA) and Maximum Likelihood Classification (MLC) algorithms in order to achieve appropriate results for mapping, monitoring, and preserving valuable coastal wetland ecosystems of southern India using Indian Remote Sensing Satellite (IRS) 1C/1D LISS-III and Landsat-5 Thematic Mapper image data. The ISODATA and MLC methods were applied to these satellite image data to produce maps of 5, 10, 15, and 20 wetland classes for each of three contrasting coastal wetland sites: Pitchavaram, Vedaranniyam, and Rameswaram. The accuracy of the derived classes was assessed with the simplest descriptive statistic, overall accuracy, and a discrete multivariate technique, KAPPA accuracy. ISODATA classification resulted in maps with poor accuracy compared to MLC classification, which produced maps with improved accuracy. However, there was a systematic decrease in overall accuracy and KAPPA accuracy when a larger number of classes was derived from the IRS-1C/1D and Landsat-5 TM imagery by ISODATA and MLC. There were two principal factors for the decreased classification accuracy, namely spectral overlapping/confusion and inadequate spatial resolution of the sensors. Of the two, the limited instantaneous field of view (IFOV) of these sensors caused a number of mixture pixels (mixels) to occur in the image, and their effect on the classification process was a major obstacle to deriving accurate wetland cover types, in spite of the increasing spatial resolution of new-generation Earth Observation Sensors (EOS).
In order to improve the classification accuracy, a soft classification method based on Linear Spectral Mixture Modeling (LSMM) was applied to calculate the spectral mixture and classify the IRS-1C/1D LISS-III and Landsat-5 TM imagery. This method considered the number of reflectance end-members that form the scene spectra, followed by the determination of their nature, and finally the decomposition of the spectra into their end-members. To evaluate the LSMM areal estimates, the resulting fractional end-members were compared with the normalized difference vegetation index (NDVI) and ground truth data, as well as with estimates derived from the traditional hard classifier (MLC). The findings revealed that NDVI values and vegetation fractions were positively correlated ($r^2$ = 0.96, 0.95, and 0.92 for Rameswaram, Vedaranniyam, and Pitchavaram, respectively) and that NDVI and soil fraction values were negatively correlated ($r^2$ = 0.53, 0.39, and 0.13), indicating the reliability of the sub-pixel classification. Compared with ground truth data, the precision of LSMM was 92% for deriving the moisture fraction and 96% for the soil fraction. The LSMM in general would seem well suited to locating small wetland habitats that occur as sub-pixel inclusions, and to representing continuous gradations between different habitat types.
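At its core, linear spectral mixture modeling treats each pixel spectrum as a linear combination of end-member spectra and solves for the fractions. A minimal sketch under stated assumptions: the 4-band end-member spectra below are hypothetical, and the sum-to-one constraint is enforced softly via a heavily weighted extra equation (the paper's actual end-members and constraint handling may differ):

```python
import numpy as np

def unmix(pixel, endmembers):
    """Least-squares end-member fractions with a (softly enforced) sum-to-one
    constraint, added as a heavily weighted extra equation."""
    E = np.asarray(endmembers, dtype=float)          # shape: bands x endmembers
    w = 1e3                                          # weight on the constraint row
    A = np.vstack([E, w * np.ones((1, E.shape[1]))])
    b = np.append(np.asarray(pixel, dtype=float), w)
    fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
    return fractions

# Hypothetical 4-band reflectance spectra for three end-members:
# columns are vegetation, soil, water (illustrative numbers only).
E = np.array([[0.05, 0.20, 0.02],
              [0.08, 0.25, 0.01],
              [0.45, 0.30, 0.01],
              [0.50, 0.35, 0.00]])
mix = 0.6 * E[:, 0] + 0.3 * E[:, 1] + 0.1 * E[:, 2]  # pixel with known fractions
f = unmix(mix, E)                                     # should recover 0.6/0.3/0.1
```

Running `unmix` per pixel yields the fraction images that the abstract compares against NDVI and ground truth; a full implementation would additionally clip or constrain fractions to be non-negative.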

Evaluation of Water Productivity of Thailand and Improvement Measure Proposals

  • Suthidhummajit, Chokchai;Koontanakulvong, Sucharit
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2019.05a
    • /
    • pp.176-176
    • /
    • 2019
  • Thailand has issued a national strategic development master plan with issues related to water resources and water security across the entire water management system. Water resources are an important factor for living and for the development of the country's socio-economy to be stable, prosperous, and sustainable. Therefore, water management in both multidimensional and multi-sectoral systems is important and will support socio-economic and environmental development. The direction of national development follows the 20-year national strategic framework, which requires the country to raise its security level in terms of water, energy, and food. To respond to the proposed goals, there is a subplan to increase the water productivity of the entire water system for economic development by evaluating use value and creating more added value from water use to meet international standard levels. This study aims to evaluate the water productivity of Thailand in each basin and in all sectors, such as the agricultural, service, and industrial sectors, by using water use data from water account analysis and GDP data from NESDB during 1996-2015. A comparison of water productivity with other countries will also be conducted, and, in addition, measures to improve water productivity over the next 20 years will be explored in response to the National Strategic Master Plan goals. Water productivity is defined as output per unit of water depleted. The simplest way to compare water productivity across different enterprises is in monetary terms. The World Bank presents water productivity as an indication of the efficiency with which each country uses its water resources. Two data sets are used for the water productivity analyses: the first is water use data at end users and the second is Gross Domestic Product.
The water use at end users is estimated by the water account method based on the System of Environmental-Economic Accounting for Water (SEEA-Water) concept of the United Nations. The water account shows the analysis of the water balance between the use and supply of each water resource in physical terms. The water supply and use linkage in the water account analysis is separated into phases, i.e., water sources, water managers, water service providers, and water users at end use, under water regulators, across all kinds of water use activities such as household, industrial, agricultural, tourism, hydropower, and ecological conservation uses. The Gross Domestic Product (GDP), a well-known measure of national economic growth, is not actually a comprehensive approach to describing all aspects of national economic status, since GDP does not take into account the costs of the negative impacts on natural resources that result from the overexploitation of development projects; at present, however, integrating the environment with the economy of a country to measure its economic growth with GDP is accepted worldwide. The study results will show the water use in each basin, use types at end users, and water productivity in each sector from 1996-2015 compared with other countries. In addition, productivity improvement measures will be explored and proposed for the National Strategic Master Plan.

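The monetary definition above (output per unit of water depleted) reduces to a simple ratio per sector and overall. A toy sketch with made-up sectoral figures, not Thailand's actual data:

```python
# Hypothetical figures: value added (million USD) and water depleted (million m^3).
sectors = {
    "agriculture": {"value_added": 12_000.0, "water_used": 30_000.0},
    "industry":    {"value_added": 45_000.0, "water_used": 3_000.0},
    "services":    {"value_added": 60_000.0, "water_used": 1_500.0},
}

def water_productivity(value_added, water_used):
    """Monetary water productivity: output per unit of water depleted (USD/m^3)."""
    return value_added / water_used

# Per-sector productivity and the economy-wide ratio (total output / total water).
wp = {s: water_productivity(v["value_added"], v["water_used"])
      for s, v in sectors.items()}
overall = (sum(v["value_added"] for v in sectors.values())
           / sum(v["water_used"] for v in sectors.values()))
```

The typical pattern the study examines is visible even in toy numbers: agriculture depletes most of the water at low monetary productivity, so sectoral reallocation and irrigation efficiency dominate the improvement measures.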

Phenophase Extraction from Repeat Digital Photography in the Northern Temperate Type Deciduous Broadleaf Forest (온대북부형 낙엽활엽수림의 디지털 카메라 반복 이미지를 활용한 식물계절 분석)

  • Han, Sang Hak;Yun, Chung Weon;Lee, Sanghun
    • Journal of Korean Society of Forest Science
    • /
    • v.109 no.4
    • /
    • pp.361-370
    • /
    • 2020
  • Long-term observation of the life cycle of plants allows the identification of critical signals of the effects of climate change on plants. Indeed, plant phenology is the simplest approach to detecting climate change. Observation of seasonal changes in plants using digital repeat imaging helps overcome the limitations of both traditional methods and satellite remote sensing. In this study, we demonstrate the utility of camera-based repeat digital imaging in this context. We observed the biological events of plants and quantified their phenophases in the northern temperate type deciduous broadleaf forest of Jeombong Mountain. This study aimed to identify trends in the seasonal characteristics of Quercus mongolica (deciduous broadleaf forest) and Pinus densiflora (evergreen coniferous forest). The vegetation index, the green chromatic coordinate (GCC), was calculated from the RGB channel image data. The GCC amplitude was smaller in the evergreen coniferous forest than in the deciduous forest, and its slope (increasing in spring and decreasing in autumn) was gentler. In the pine forest, growth began earlier and ended later than in the oak forest. Verification of the phenophases showed high accuracy, with root-mean-square error (RMSE) values of 0.008 (region of interest [ROI]1) and 0.006 (ROI3). These results reflect the tendency of the GCC trajectory in a northern temperate type deciduous broadleaf forest. Based on the results, we propose that repeat imaging using digital cameras will be useful for observing phenophases.
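The green chromatic coordinate used above is simply the green fraction of the RGB digital numbers, averaged over a region of interest. A minimal sketch with toy pixel values (not the study's imagery):

```python
import numpy as np

def gcc(rgb_roi):
    """Green chromatic coordinate over a region of interest.

    rgb_roi: array of shape (H, W, 3) holding R, G, B digital numbers.
    Returns the mean of G / (R + G + B) across the ROI pixels.
    """
    roi = np.asarray(rgb_roi, dtype=float)
    total = roi.sum(axis=-1)
    valid = total > 0  # skip fully dark pixels to avoid division by zero
    return np.mean(roi[..., 1][valid] / total[valid])

# Toy 2x2 ROI of a green-ish canopy patch.
patch = np.array([[[60, 120, 60], [50, 100, 50]],
                  [[40,  80, 40], [30,  60, 30]]])
```

Tracking this scalar per ROI across daily images produces the seasonal GCC trajectory from which the spring rise, autumn fall, and growth onset/end dates are extracted.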

Risk Analysis for the Rotorcraft Landing System Using Comparative Models Based on Fuzzy (퍼지 기반 다양한 모델을 이용한 회전익 항공기 착륙장치의 위험 우선순위 평가)

  • Na, Seong Hyeon;Lee, Gwang Eun;Koo, Jeong Mo
    • Journal of the Korean Society of Safety
    • /
    • v.36 no.2
    • /
    • pp.49-57
    • /
    • 2021
  • In the case of military supplies, any potential failure and the causes of failures must be considered. This study is aimed at examining the failure modes of a rotorcraft landing system to identify the priority items. Failure mode and effects analysis (FMEA) is applied to the rotorcraft landing system. In general, FMEA is used to evaluate reliability in engineering fields. Three elements, specifically severity, occurrence, and detectability, are used to evaluate the failure modes. The risk priority number (RPN) is obtained by multiplying the scores, or risk levels, for severity, occurrence, and detectability. In this study, different weights of the three elements are considered in the RPN assessment to implement the FMEA. Furthermore, the FMEA is implemented using a fuzzy rule base, a similarity aggregation model (SAM), and a grey theory model (GTM) to perform a comparative analysis. The same input data are used for all models to enable a fair comparison. The FMEA is applied to military supplies with methodological issues taken into account. In general, fuzzy theory rests on converting crisp values into fuzzy inputs, and fuzzy FMEA is the basic method of obtaining a fuzzy RPN. The three elements of the FMEA are expressed as five linguistic terms. Triangular fuzzy sets are the simplest membership functions defined over the three elements; a fuzzy set is described by a membership function mapping the elements to the interval [0, 1]. The fuzzy rule base is designed to identify the failure modes according to expert knowledge. The IF-THEN criteria of the fuzzy rule base are formulated to convert a fuzzy input into a fuzzy output. The total number of rules in the fuzzy rule base is 125. The SAM expresses the judgments corresponding to the individual experiences of the experts performing the FMEA as weights.
Implementing the SAM is significant when operating on fuzzy sets that represent expert opinion, as it can confirm the concurrence of expert opinions. The GTM can perform defuzzification to obtain a crisp value from a fuzzy membership function and determine the priorities by considering the degree of relation, the form of the matrix, and the weights for severity, occurrence, and detectability. The proposed models prioritize the failure modes of the rotorcraft landing system. The conventional FMEA and the fuzzy rule base produce the same priorities; the SAM and GTM produce different priorities, with objectivity achieved through weight setting.
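The conventional RPN and the triangular membership functions mentioned above can be sketched as follows; the 5-term linguistic scale and its break-points here are illustrative assumptions, not the paper's calibration:

```python
def rpn(severity, occurrence, detectability):
    """Conventional risk priority number: the product of the three ratings."""
    return severity * occurrence * detectability

def triangular(x, a, b, c):
    """Triangular membership function with feet a, c and peak b,
    mapping a rating x into the interval [0, 1]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical 5-term linguistic scale on a 1-10 rating axis.
terms = {"very low": (0, 1, 3), "low": (1, 3, 5), "moderate": (3, 5, 7),
         "high": (5, 7, 9), "very high": (7, 9, 10)}

def fuzzify(x):
    """Degree of membership of a crisp rating in each linguistic term."""
    return {t: triangular(x, *p) for t, p in terms.items()}
```

A fuzzy rule base then maps combinations of these memberships (5 terms per element gives the 125 IF-THEN rules the abstract mentions) to a fuzzy risk level, which SAM weighting or grey-theory defuzzification converts back into a ranking.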

Interface Establishment between Reinforcement Learning Algorithm and External Analysis Program for AI-based Automation of Bridge Design Process (AI기반 교량설계 프로세스 자동화를 위한 강화학습 알고리즘과 외부 해석프로그램 간 인터페이스 구축)

  • Kim, Minsu;Choi, Sanghyun
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.34 no.6
    • /
    • pp.403-408
    • /
    • 2021
  • Currently, in the design process of civil structures such as bridges, it is common to produce the final design by repeating the redesign process whenever the initial design is found not to meet the standards after a structural review. This iterative process extends the design time and causes inefficient consumption of engineering manpower, which should be devoted to higher-level design, on simple, repetitive, mechanical work. This problem can be resolved by automating the design process, but the external analysis program used in the design process has been the biggest obstacle to such automation. In this study, we constructed an AI-based automation system for the bridge design process, including an interface that can control both a reinforcement learning algorithm and an external analysis program, to replace the repetitive tasks in the current design process. The prototype of the system built in this study was developed for a 2-span RC Rahmen bridge, one of the simplest bridge systems. In the future, the developed interface system is expected to serve as a basic technology for linking the latest AI with other types of bridge designs.
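An interface between a learning loop and an external analysis program can be sketched as a gym-style environment that shells out to the solver. Everything in this sketch is an assumption for illustration: the solver command name, its JSON in/out protocol, and the reward shaping are hypothetical, not the system the paper describes:

```python
import json
import subprocess

def subprocess_solver(cmd, design_json):
    """Run the external analysis program on one candidate design (JSON in/out).
    The command name and protocol are hypothetical."""
    result = subprocess.run([cmd], input=design_json,
                            capture_output=True, text=True, check=True)
    return result.stdout

class BridgeDesignEnv:
    """Gym-style wrapper: the RL agent proposes design parameters, the external
    solver reviews them, and the reward favors cheap designs that pass."""

    def __init__(self, solver=subprocess_solver, cmd="analyze_bridge"):
        self.solver = solver  # injectable, so the solver can be stubbed in tests
        self.cmd = cmd

    def step(self, design_params):
        out = json.loads(self.solver(self.cmd, json.dumps(design_params)))
        passed = out["stress_ratio"] <= 1.0        # did the structural check pass?
        reward = -out["cost"] if passed else -1e9  # heavy penalty for failing designs
        return out, reward, passed                 # observation, reward, done
```

Wrapping the analysis program behind a `step()` call like this is what lets a standard RL training loop drive the redesign iterations that engineers currently perform by hand.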

A Study on Comparison of Density Test Methods for Quality Control of Cement and Mineral Admixture (시멘트 및 혼화재의 품질관리를 위한 밀도 시험방법 비교 연구)

  • Lee, Jae-Seung;Noh, Sang-Kyun;Park, Cheol;Shin, Hong-Chul
    • Journal of the Korean Recycled Construction Resources Institute
    • /
    • v.10 no.4
    • /
    • pp.435-442
    • /
    • 2022
  • In this study, the density obtained by KS L 5110 was compared with those obtained by a gas pycnometer and an electronic densimeter for efficient density management of cement, blast furnace slag powder, and fly ash. The correlation and usability of each test method were reviewed, and based on the experimental results, the availability of alternative test methods was analyzed. The measured densities of cement, blast furnace slag powder, and fly ash tended to decrease in the order of gas pycnometer, KS L 5110, and electronic densimeter, because the volume range of the evaluated sample differs among the test methods. The coefficient of determination R² was in the range of 0.71 to 0.93, so the test methods showed relatively good correlation with one another. If a correction is applied through this correlation, the alternative test methods can be used. A usability review considering the test procedure, measurement time, and coefficient of variation found that the gas pycnometer had the simplest test procedure and good reliability. In addition, its reproducibility between testers is expected to be relatively high because little operator skill is required.

Diagnosis of Location and Size of Lesions using Chest X-ray Image (X-선 영상을 이용한 암의 위치 및 크기 진단)

  • Son, Jung-Min;Ahn, Byung-Ju
    • Journal of the Korean Society of Radiology
    • /
    • v.17 no.1
    • /
    • pp.167-173
    • /
    • 2023
  • General X-ray radiography is the simplest examination and one of the most important for obtaining a large amount of information. Nevertheless, current general radiography does not allow in-depth observation. Information about the anatomy of the human body and changes caused by disease can be obtained from general radiographs, but it is difficult to determine the size and shape of an actual lesion because of image magnification. In this study, PA and LAT images were acquired, and the magnification of a cancer sample was calculated from the images by measuring its distances. By correcting for the magnification, the actual length and thickness of the cancer sample were measured and compared with the CT image and the actual sample size. After PA and LAT images of an inserted 6.0 mm cancer sample were obtained and the magnification was corrected, the measured length was 5.9 mm and the thickness was 6.1 mm, close to the actual values. The difficulty of obtaining the magnification, which normally requires knowing the actual distance from the detector to the cancer sample, was resolved by deriving the magnification from the PA and LAT images, making it possible to measure the cancer sample size accurately. General radiography may therefore provide useful information in situations where CT imaging is difficult.
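The correction described above follows from point-source projection geometry: magnification M = SID/SOD, where SOD is the source-object distance, i.e. SID minus the lesion-to-detector distance that the orthogonal view supplies. A sketch with hypothetical distances (the study's actual geometry is not given here):

```python
def magnification(sid, lesion_to_detector):
    """Point-source projection magnification: M = SID / SOD,
    with SOD = SID - lesion-to-detector distance (the lesion depth,
    read off the orthogonal PA or LAT view)."""
    return sid / (sid - lesion_to_detector)

def true_size(measured_mm, sid, lesion_to_detector):
    """Divide the size measured on the image by the magnification."""
    return measured_mm / magnification(sid, lesion_to_detector)

# Hypothetical distances in cm: 180 source-image distance, lesion 15 in front
# of the detector; a 6.5 mm image measurement corrects to about 5.96 mm.
m = magnification(180.0, 15.0)
actual_mm = true_size(6.5, 180.0, 15.0)
```

The point of using two orthogonal views is that each one provides the depth coordinate the other needs, so no direct measurement of the lesion-to-detector distance is required.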

Development of a Method for Producing Liposome Ascorbic acid with Increased Bio-absorption (생체 흡수율이 증가된 liposomal ascorbic acid 제조법 개발)

  • Cha, Ji Hyun;Woo, Young Min;Jo, Eun Sol;Cha, Jae Young;Lee, Sang Hyeon;Lee, Keun Woo;Kim, Andre
    • Journal of Life Science
    • /
    • v.32 no.3
    • /
    • pp.232-240
    • /
    • 2022
  • Various methods are known for preparing liposomes, the simplest being the Bangham method, which has been widely used. Although this approach can produce liposomes effectively at a small experimental scale, large-scale production cannot be performed easily due to difficulties in removing the organic solvent and the size of the reactor required to form the lipid film. On the other hand, the emulsion method can mass-produce liposomes with uniform particles but has the disadvantage of a significantly low capture rate. This study therefore developed an optimal heat-based liposome processing method with improved capture rate and stability; bio-absorption experiments were performed by oral administration to SD rats, and the capture rate, particle size, and zeta potential were measured. Through the heating method, small and uniform liposomes of about 214 nm were formed with a capture rate of 38.67%, confirming that liposomes prepared by heating have a higher capture rate than the 26.46% achieved through emulsion. Blood concentrations showed a 1.5- to 2-fold increase in all groups, gradually decreasing over 4-12 hr. The highest blood concentration was about 12.017 ㎍/ml for ascorbic acid powder, 13.871 ㎍/ml for the emulsion liposome, and 16.322 ㎍/ml for the heating liposome, showing an improved absorption rate.