• Title/Summary/Keyword: 3-Dimensionality


Design and Performance Analysis of a Parallel Cell-Based Filtering Scheme using Horizontally-Partitioned Technique (수평 분할 방식을 이용한 병렬 셀-기반 필터링 기법의 설계 및 성능 평가)

  • Chang, Jae-Woo;Kim, Young-Chang
    • The KIPS Transactions: Part D / v.10D no.3 / pp.459-470 / 2003
  • It is necessary to study high-dimensional index structures for efficiently retrieving high-dimensional data, because attribute vectors in data warehousing and feature vectors in multimedia databases are inherently high-dimensional. Many high-dimensional index structures have been proposed for this purpose, but they suffer from the so-called 'dimensional curse' problem: retrieval performance degrades sharply as the dimensionality increases. To address this, the cell-based filtering (CBF) scheme was proposed, but its performance still decreases linearly with increasing dimensionality. To cope with this, parallel processing techniques are needed. In this paper, we propose a parallel CBF scheme that uses a horizontal-partitioning technique for declustering. To maximize the retrieval performance of the proposed parallel CBF scheme, we build it on a shared-nothing (SN) cluster architecture. In addition, we present a data insertion algorithm, a range query processing algorithm, and a k-NN query processing algorithm suited to the SN cluster architecture. Finally, we show that our parallel CBF scheme achieves retrieval performance that improves in proportion to the number of servers in the SN cluster, compared with the conventional CBF scheme.
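
To make the declustering idea above concrete, here is a minimal sketch (not the authors' implementation) in which feature vectors are horizontally partitioned across servers in round-robin fashion and a k-NN query is answered by merging per-server candidates; all names and parameters are hypothetical.

```python
# Minimal sketch of horizontally-partitioned parallel k-NN, assuming a
# round-robin declustering across n_servers; not the paper's CBF interfaces.
import numpy as np

def decluster(vectors, n_servers):
    """Assign vectors to servers by round-robin (horizontal partitioning)."""
    return [vectors[i::n_servers] for i in range(n_servers)]

def local_knn(partition, query, k):
    """k nearest neighbours within one server's partition (Euclidean distance)."""
    dists = np.linalg.norm(partition - query, axis=1)
    return dists[np.argsort(dists)[:k]]      # each server returns its k best distances

def parallel_knn(partitions, query, k):
    """Merge the per-server candidate lists into the global k-NN answer."""
    candidates = np.concatenate([local_knn(p, query, k) for p in partitions])
    return np.sort(candidates)[:k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.random((10_000, 64))           # 64-dimensional feature vectors
    parts = decluster(data, n_servers=4)
    print(parallel_knn(parts, query=rng.random(64), k=5))
```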

The Study of the Validity Test on the Self-monitoring Scale (자기 검색척도(Self-Monitoring Scale)의 타당성 검정에 관한 연구)

  • 이선아
    • Journal of Korean Academy of Nursing / v.28 no.3 / pp.751-759 / 1998
  • In this study, both a literature survey and empirical research were carried out to test the validity of the scales measuring the self-monitoring construct in nurses. The self-monitoring scale could not be classified into five factors as Snyder suggested. Other scholars (Briggs, Cheek and Buss, 1980) suggested a three-factor classification, which was accepted by Snyder and Gangestad (1986), while John, Cheek and Klohnen (1996) claimed a two-factor classification. As discussed, factor analysis is used to demonstrate convergent validity within a factor and discriminant validity between factors. However, depending on the researcher, many variations in the classification of factors were found, and previous findings lacked content and discriminant validity. It is also important to note that Snyder's self-monitoring scale did not factor-load at over .30 for all 25 items, regardless of how many factors were extracted. According to the findings of this study, the self-monitoring scale neither classified into five, three, or two factors nor factor-loaded as hypothesized. It is also clear that Snyder's self-monitoring scale lacks convergent validity, as its sub-factors failed to show uni-dimensionality. The A self-monitoring scale not only fails to overcome the problems of Snyder's self-monitoring scale but also loses the attractiveness of the original; in this study it was not classified into either a two- or three-factor structure as hypothesized. It is, of course, not desirable to use any scale that lacks convergent and discriminant validity, even though it has been widely used and highly influential in social psychology. To overcome the shortcomings of Snyder's self-monitoring scale, Lennox and Wolfe (1984) suggested 13 items. This study tested the validity and reliability of that scale and found that the two factors were classified and loaded as expected. Reliability was also confirmed by Cronbach's α for each factor and for the total items. In addition, a confirmatory factor analysis of the 13 items was run with the LISREL 8.12 program to confirm convergent validity in a two-factor classification; the model fit was sound, whereas the self-monitoring scale did not fit and was not validated. Thus, it is recommended to use neither the original nor the abbreviated self-monitoring scale but the 13 items in future studies. It should also be noted that items 7 and 13 should be removed to obtain better uni-dimensionality for the 13 items, because in the factor analysis these items loaded at over .30 on both factors. It is also necessary to double-check the cause of this double loading on the two factors: it could be a problem with the data or with the scale itself. Therefore, additional studies should follow to clarify this matter.
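
Since reliability in this abstract is reported via Cronbach's α, a minimal sketch of that computation is given below; the 13-item response matrix is synthetic and purely illustrative.

```python
# Cronbach's alpha for an item-response matrix (rows = respondents,
# columns = items); synthetic data, illustrative only.
import numpy as np

def cronbach_alpha(responses):
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                          # number of items
    item_vars = responses.var(axis=0, ddof=1)       # per-item variances
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))
items = latent + rng.normal(scale=1.0, size=(200, 13))   # 13 correlated items
print(round(cronbach_alpha(items), 3))
```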


An Adversarial Attack Type Classification Method Using Linear Discriminant Analysis and k-means Algorithm (선형 판별 분석 및 k-means 알고리즘을 이용한 적대적 공격 유형 분류 방안)

  • Choi, Seok-Hwan;Kim, Hyeong-Geon;Choi, Yoon-Ho
    • Journal of the Korea Institute of Information Security & Cryptology / v.31 no.6 / pp.1215-1225 / 2021
  • Although Artificial Intelligence (AI) techniques have shown impressive performance in various fields, they are vulnerable to adversarial examples, which induce misclassification by adding human-imperceptible perturbations to the input. Previous studies on defending against adversarial examples can be classified into three categories: (1) model retraining methods; (2) input transformation methods; and (3) adversarial example detection methods. However, even though defense methods against adversarial examples have constantly been proposed, there has been no research on classifying the type of adversarial attack. In this paper, we propose an adversarial attack family classification method based on dimensionality reduction and clustering. Specifically, after extracting the adversarial perturbation from an adversarial example, we perform Linear Discriminant Analysis (LDA) to reduce the dimensionality of the perturbation and apply the k-means algorithm to classify the adversarial attack family. From experimental results on the MNIST and CIFAR-10 datasets, we show that the proposed method can efficiently classify five types of adversarial attacks (FGSM, BIM, PGD, DeepFool, C&W). We also show that the proposed method provides good classification performance even when the legitimate input corresponding to the adversarial example is unknown.
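
A minimal sketch of the described pipeline, using scikit-learn's LDA and k-means on perturbations extracted from adversarial examples; the input shapes, synthetic data, and five-family labelling are assumptions, not the authors' code.

```python
# Sketch: reduce adversarial perturbations with LDA, then cluster with k-means
# to group them into attack families (five families assumed, e.g. FGSM, BIM,
# PGD, DeepFool, C&W); data below are synthetic stand-ins for MNIST inputs.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cluster import KMeans

def classify_attack_families(x_adv, x_clean, family_labels, n_families=5):
    perturbations = (x_adv - x_clean).reshape(len(x_adv), -1)    # extract noise
    lda = LinearDiscriminantAnalysis(n_components=n_families - 1)
    reduced = lda.fit_transform(perturbations, family_labels)    # supervised reduction
    return KMeans(n_clusters=n_families, n_init=10).fit_predict(reduced)

rng = np.random.default_rng(0)
clean = rng.random((500, 28, 28))
labels = rng.integers(0, 5, size=500)
adv = clean + 0.1 * rng.standard_normal((500, 28, 28)) * (labels[:, None, None] + 1)
print(np.bincount(classify_attack_families(adv, clean, labels)))
```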

Performance Comparison of Anomaly Detection Algorithms: in terms of Anomaly Type and Data Properties (이상탐지 알고리즘 성능 비교: 이상치 유형과 데이터 속성 관점에서)

  • Jaeung Kim;Seung Ryul Jeong;Namgyu Kim
    • Journal of Intelligence and Information Systems / v.29 no.3 / pp.229-247 / 2023
  • With the increasing emphasis on anomaly detection across various fields, diverse anomaly detection algorithms have been developed for various data types and anomaly patterns. However, the performance of anomaly detection algorithms is generally evaluated on publicly available datasets, and the specific performance of each algorithm on anomalies of particular types remains unexplored. Consequently, selecting an appropriate anomaly detection algorithm for specific analytical contexts poses challenges. Therefore, in this paper, we aim to investigate the types of anomalies and various attributes of data. Subsequently, we intend to propose approaches that can assist in the selection of appropriate anomaly detection algorithms based on this understanding. Specifically, this study compares the performance of anomaly detection algorithms for four types of anomalies: local, global, contextual, and clustered anomalies. Through further analysis, the impact of label availability, data quantity, and dimensionality on algorithm performance is examined. Experimental results demonstrate that the most effective algorithm varies depending on the type of anomaly, and certain algorithms exhibit stable performance even in the absence of anomaly-specific information. Furthermore, in some types of anomalies, the performance of unsupervised anomaly detection algorithms was observed to be lower than that of supervised and semi-supervised learning algorithms. Lastly, we found that the performance of most algorithms is more strongly influenced by the type of anomalies when the data quantity is relatively scarce or abundant. Additionally, in cases of higher dimensionality, it was noted that excellent performance was exhibited in detecting local and global anomalies, while lower performance was observed for clustered anomaly types.
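
As a toy illustration of such a comparison (not the study's benchmark protocol), the sketch below scores two common unsupervised detectors on synthetic global anomalies; all dataset and parameter choices are assumptions.

```python
# Toy comparison of two unsupervised anomaly detectors on synthetic data;
# illustrative only, not the paper's experimental setup.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(950, 10))
global_anom = rng.uniform(-6, 6, size=(50, 10))   # far-away "global" anomalies
X = np.vstack([normal, global_anom])
y = np.r_[np.zeros(950), np.ones(50)]

iso_scores = -IsolationForest(random_state=0).fit(X).score_samples(X)
lof_scores = -LocalOutlierFactor(n_neighbors=20).fit(X).negative_outlier_factor_
print("IsolationForest AUC:", round(roc_auc_score(y, iso_scores), 3))
print("LOF AUC:            ", round(roc_auc_score(y, lof_scores), 3))
```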

Validation of the Korean Functional Gait Assessment in Patients With Stroke (뇌졸중 환자를 대상으로 실시한 한글판 기능적 보행평가의 타당도)

  • Park, So-yeon
    • Physical Therapy Korea / v.23 no.2 / pp.35-43 / 2016
  • Background: The Functional Gait Assessment (FGA) was developed to measure gait-related activities. The FGA has been translated into Korean, but only a few of its psychometric characteristics have been studied. Objects: The purpose of this study was to evaluate the validity and reliability of the Korean version of the FGA using Rasch analysis. Methods: The study included 120 patients with stroke (age range 30~83 years; mean ± standard deviation 58.3 ± 11.1). The FGA and Berg Balance Scale were administered and analysed for dimensionality of the scale, item difficulty, scale reliability and separation, and the item-person map using Rasch analysis. Results: The 4 rating scale categories of the FGA satisfied the optimal rating scale criteria. Most items of the FGA showed sound psychometric properties except 2 items ('gait with horizontal head turns' and 'gait with narrow base of support'), and these 2 misfit items were excluded from all further analyses. The remaining 8 items were arranged in order of difficulty: the most difficult item was 'gait with eyes closed', the item of middle difficulty was 'gait level surface', and the easiest item was 'gait with vertical head turns'. The person separation reliability was .93 and the person separation index was 3.57. Conclusion: This study suggests that the 8-item Korean FGA is a valid measure for assessing gait-related balance performance and for setting rehabilitation goals in patients with stroke.
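
For context, the person separation index reported above can be related to the person separation reliability by the standard Rasch relation G = sqrt(R / (1 - R)); the quick check below assumes the small gap to the reported 3.57 is due only to rounding of R.

```python
# Person separation index from separation reliability (standard Rasch relation);
# with R = 0.93 this gives ~3.6, close to the reported 3.57 once R is rounded.
R = 0.93
G = (R / (1 - R)) ** 0.5
print(round(G, 2))
```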

A numerical study on the seepage failure by heave in sheeted excavation pits

  • Koltuk, Serdar;Fernandez-Steeger, Tomas M.;Azzam, Rafig
    • Geomechanics and Engineering / v.9 no.4 / pp.513-530 / 2015
  • Commonly, the base stability of sheeted excavation pits against seepage failure by heave is evaluated using two-dimensional groundwater flow models and Terzaghi's failure criterion. The objective of the present study is to investigate the effect of three-dimensional groundwater flow on heave for sheeted excavation pits of various dimensions. For this purpose, steady-state groundwater flow analyses are performed using the finite element program ABAQUS 6.12. It is shown that, in homogeneous soils, depending on the ratio of half the excavation width to the embedment depth, b/D, the ratio of the safety factor obtained from 3D analyses to that obtained from 2D analyses, FS(3D)/FS(2D), can reach up to 1.56 and 1.34 for square- and circular-shaped excavations, respectively. As the failure body, both an infinitesimal soil column adjacent to the wall (Baumgart and Davidenkoff's criterion) and a three-dimensional failure body with the width suggested by Terzaghi for two-dimensional cases are used. It is shown that the ratio FS(Terzaghi)/FS(Davidenkoff) varies between 0.75 and 0.94 depending on the ratio b/D. Additionally, the effects of model size, the shape of the excavation pit, and anisotropic permeability on heave are studied. Finally, the problem is investigated for excavation pits in stratified soils, and important points are emphasized.
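
For reference, Terzaghi's heave criterion compares the submerged weight of the soil prism in front of the wall with the uplift from the excess pore pressure at its base; the sketch below uses assumed soil parameters for illustration and is not the paper's finite element model.

```python
# Terzaghi-type safety factor against heave for a soil prism of depth D:
# FS = (gamma_sub * D) / (gamma_w * h_avg), where h_avg is the average excess
# head at the prism base. All parameter values are assumptions for illustration.
gamma_sub = 10.0   # submerged unit weight of soil [kN/m^3]
gamma_w = 9.81     # unit weight of water [kN/m^3]
D = 4.0            # embedment depth of the wall [m]
h_avg = 2.2        # average excess head at the prism base [m]

FS = (gamma_sub * D) / (gamma_w * h_avg)
print(round(FS, 2))   # ~1.85
```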

Three-dimensional flow characteristics and heat transfer to a circular cylinder with a hot circular impinging air jet (원형 실린더에 충돌하는 고온 제트의 3차원 유동 특성 및 열전달)

  • Hong, Gi-Hyeok;Gang, Sin-Hyeong
    • Transactions of the Korean Society of Mechanical Engineers B / v.21 no.2 / pp.285-293 / 1997
  • Numerical calculations have been performed for the flow and heat transfer to a circular cylinder from a hot circular impinging air jet. The characteristics of the flow and heat transfer are investigated and compared with the two-dimensional flow. The present study emphasizes the three-dimensionality of the flow and heat transfer. The effects of the buoyancy force and the size of the jet are also studied. The noticeable difference between the three- and two-dimensional cases is that, in the recirculation region, there is axial flow of low-temperature air from the outside into the center plane of the cylinder. The local Nusselt number over the cylinder surface is higher for the large jet than for the small jet, since the energy loss of the hot jet to the ambient air decreases as the jet size increases. As the buoyancy force increases, the flow accelerates, so the period of cooling by the ambient air is reduced, which results in a higher local Nusselt number over the surface.

An Collaborative Filtering Method based on Associative Cluster Optimization for Recommendation System (추천시스템을 위한 연관군집 최적화 기반 협력적 필터링 방법)

  • Lee, Hyun Jin;Jee, Tae Chang
    • Journal of Korea Society of Digital Industry and Information Management / v.6 no.3 / pp.19-29 / 2010
  • The marketing model has shifted from customer acquisition to customer retention, moving toward enhancing the quality of customer interaction to add value for customers. Personalization has emerged against this background. Web sites are accelerating the adoption of personalization, and in the face of rapidly growing data, quantitative analytical expertise is required. Automatically analyzing large amounts of data and delivering the results in real time pose the main technical problems of personalization. A recommendation algorithm implements personalization by predicting, from the database, whether a new customer is interested in or likely to purchase particular items. The problem is that as the number of users grows, the recommendation time of the algorithm increases. In this paper, to solve this problem, a recommendation system based on clustering and dimensionality reduction is proposed. First, customers with similar orientations are clustered; then the customer-relationship space is reduced to a low-dimensional space. Because the neighbor search for recommendations is performed in the low-dimensional space, the computation time is greatly reduced.
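
A minimal sketch of the two-stage idea described above (cluster customers, then search for neighbors in a reduced space), using k-means and truncated SVD as stand-ins; all data and parameter values are illustrative assumptions rather than the paper's method.

```python
# Sketch: cluster customers on the rating matrix, reduce its dimensionality with
# truncated SVD, and search for neighbors only inside the query user's cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import TruncatedSVD
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
ratings = rng.integers(0, 6, size=(1000, 300)).astype(float)   # users x items

clusters = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(ratings)
low_dim = TruncatedSVD(n_components=20, random_state=0).fit_transform(ratings)

def neighbors(user_idx, k=10):
    """Neighbor search restricted to the user's cluster, in the reduced space."""
    members = np.flatnonzero(clusters == clusters[user_idx])
    nn = NearestNeighbors(n_neighbors=min(k + 1, len(members))).fit(low_dim[members])
    _, idx = nn.kneighbors(low_dim[user_idx:user_idx + 1])
    return members[idx[0][1:]]          # drop the query user itself

print(neighbors(0))
```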

Improvement Depth Perception of Volume Rendering using Virtual Reality (가상현실을 통한 볼륨렌더링 깊이 인식 향상)

  • Choi, JunYoung;Jeong, HaeJin;Jeong, Won-Ki
    • Journal of the Korea Computer Graphics Society / v.24 no.2 / pp.29-40 / 2018
  • Direct volume rendering (DVR) is a commonly used method to visualize inner structures in 3D volumetric datasets. However, conventional volume rendering on a 2D display lacks depth perception due to the dimensionality reduction caused by ray casting. In this work, we investigate how emerging Virtual Reality (VR) can improve the usability of direct volume rendering. We developed a real-time, high-resolution DVR system in virtual reality and measured the usefulness of volume rendering with improved depth perception via a user study with 38 participants. The results indicate that virtual reality significantly improves the usability of DVR by allowing better depth perception.
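
For context, the dimensionality reduction mentioned above comes from compositing the samples along each viewing ray; the sketch below shows standard front-to-back alpha compositing for a single ray and is illustrative, not the authors' VR renderer.

```python
# Front-to-back alpha compositing along a single ray in direct volume rendering:
# colour and opacity from a transfer function are accumulated until the ray
# saturates. The transfer function and samples are synthetic, for illustration.
import numpy as np

def composite_ray(densities, transfer_function, opacity_threshold=0.99):
    color_acc, alpha_acc = np.zeros(3), 0.0
    for d in densities:                        # samples ordered front to back
        rgb, alpha = transfer_function(d)
        color_acc += (1.0 - alpha_acc) * alpha * rgb
        alpha_acc += (1.0 - alpha_acc) * alpha
        if alpha_acc >= opacity_threshold:     # early ray termination
            break
    return color_acc, alpha_acc

tf = lambda d: (np.array([d, 0.2, 1.0 - d]), 0.05 + 0.3 * d)   # toy transfer function
samples = np.linspace(0.0, 1.0, 200)           # densities along one ray
print(composite_ray(samples, tf))
```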

Three-dimensional Distortion-tolerant Object Recognition using Computational Integral Imaging and Statistical Pattern Analysis (집적 영상의 복원과 통계적 패턴분석을 이용한 왜곡에 강인한 3차원 물체 인식)

  • Yeom, Seok-Won;Lee, Dong-Su;Son, Jung-Young;Kim, Shin-Hwan
    • The Journal of Korean Institute of Communications and Information Sciences / v.34 no.10B / pp.1111-1116 / 2009
  • In this paper, we discuss distortion-tolerant pattern recognition using computational integral imaging reconstruction. Three-dimensional object information is captured by the integral imaging pick-up process. The captured information is numerically reconstructed at arbitrary depth levels by averaging the corresponding pixels. We apply Fisher linear discriminant analysis combined with principal component analysis to the computationally reconstructed images for distortion-tolerant recognition. Fisher linear discriminant analysis maximizes the discrimination capability between classes, and principal component analysis reduces the dimensionality with the minimum mean squared error between the original and restored images. The presented methods provide promising results for the classification of out-of-plane rotated objects.
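
A minimal sketch of the PCA-then-Fisher-LDA classification stage on flattened reconstructed images, using scikit-learn; image sizes, class counts, and the synthetic data are assumptions, not the authors' setup.

```python
# Sketch: PCA for dimensionality reduction followed by Fisher linear discriminant
# analysis for classifying reconstructed images; data below are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_classes, per_class = 4, 50
images = np.vstack([rng.normal(loc=c, scale=1.0, size=(per_class, 32 * 32))
                    for c in range(n_classes)])           # flattened 32x32 images
labels = np.repeat(np.arange(n_classes), per_class)

model = make_pipeline(PCA(n_components=40), LinearDiscriminantAnalysis())
print(round(cross_val_score(model, images, labels, cv=5).mean(), 3))
```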