• Title/Summary/Keyword: Analysis Techniques

Shape Analysis for the Activation of a Traditional Zzaim (전통짜임의 활성화를 위한 조형적 분석)

  • Namgoong, Sun
    • The Journal of the Korea Contents Association
    • /
    • v.13 no.4
    • /
    • pp.418-426
    • /
    • 2013
  • This study analyzes the formative characteristics of zzaim (traditional craft techniques for joining two or more materials to form an angle or weaving them to form a sloped side) and systematically classifies these techniques so that furniture manufacturers can use them effectively in contemporary furniture design. With these data, manufacturers can judge which type of zzaim technique suits their plan for a piece of furniture and apply the relevant technique in practice. The study classifies zzaim applications into four sites: (1) the top, (2) the middle body, (3) the lower body, and (4) the legs of the furniture. In summary, zzaim techniques are applied differently depending on the application site and the formative type of the furniture, which makes it hard for general manufacturers applying zzaim for the first time to know which technique belongs on which part. Recognizing this problem, the study provides systematic data on the formative analysis of furniture types and application sites so that general manufacturers as well as master artisans can utilize zzaim techniques more effectively.

Recent Developments in Nuclear Forensic and Nuclear Safeguards Analysis Using Mass Spectrometry

  • Song, Kyuseok;Park, Jong-Ho;Lee, Chi-Gyu;Han, Sun-Ho
    • Mass Spectrometry Letters
    • /
    • v.7 no.2
    • /
    • pp.31-40
    • /
    • 2016
  • The analysis of nuclear materials and environmental samples is an important issue in nuclear safeguards and nuclear forensics. Analysis techniques for safeguards samples have been developed to detect undeclared nuclear activities and to verify declared ones, while nuclear forensics has been developed to trace the origin and intended use of illicitly trafficked nuclear or radioactive materials. In both fields, mass spectrometry plays an important role in determining the isotope ratios of various nuclides, the content of trace elements, and production dates. The two fields typically use similar analytical instruments, but the analytical procedures and the interpretation of results differ with the analytical purpose: the isotopic ratio of the sample is the most important result in environmental sample analysis, whereas age dating and impurity analysis may also be important for nuclear forensics. In this review, important aspects of these techniques are compared, and the role of mass spectrometry, along with recent progress in related technologies, is discussed. (A brief isotope-ratio calculation sketch follows below.)
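
As a hedged illustration of the isotope-ratio measurements this review centers on (not an example from the paper itself), the Python sketch below converts measured atom ratios relative to 238U into isotope abundances and a 235U enrichment value; the ratio values and function name are placeholders.

```python
# Hedged illustration: converting measured isotope ratios (relative to U-238)
# into atom abundances and a U-235 enrichment estimate.
# The ratio values below are placeholders, not data from the cited review.

def abundances_from_ratios(ratios_to_238):
    """ratios_to_238: dict of isotope -> atom ratio n(isotope) / n(U-238)."""
    total = 1.0 + sum(ratios_to_238.values())            # include U-238 itself
    abundances = {iso: r / total for iso, r in ratios_to_238.items()}
    abundances["U238"] = 1.0 / total
    return abundances

measured = {"U234": 0.00028, "U235": 0.036, "U236": 0.00015}   # placeholder ratios
ab = abundances_from_ratios(measured)
print(f"U-235 atom fraction (enrichment): {ab['U235']:.4f}")
```

A real safeguards or forensics measurement would additionally correct for instrumental mass bias and propagate measurement uncertainties, which this sketch omits.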

Estimation of Design Rainfall by the Regional Frequency Analysis using Higher Probability Weighted Moments and GIS Techniques (고차확률가중모멘트법에 의한 지역화빈도분석과 GIS기법에 의한 설계강우량 추정)

  • Lee, Soon-Hyuk;Park, Jong-Hwa;Ryoo, Kyong-Sik;Jee, Ho-Keun;Shin, Yong-Hee
    • Proceedings of the Korean Society of Agricultural Engineers Conference
    • /
    • 2002.10a
    • /
    • pp.237-240
    • /
    • 2002
  • Design rainfall for each consecutive duration was derived by regional and at-site frequency analysis using LH-moments, based on observed data and on data simulated with Monte Carlo techniques. RRMSE, RBIAS, and RR in RRMSE for the design rainfall were computed and compared between the regional and at-site frequency analyses; the regional analysis reduced RRMSE, RBIAS, and RR in RRMSE substantially more than the at-site analysis in predicting design rainfall. RE for an optimal order of L-moments was also computed by the methods of L-, L1-, L2-, L3-, and L4-moments for the GEV distribution, and the method of L-moments was found to be more effective than the others for obtaining optimal design rainfall by region and consecutive duration in the regional frequency analysis. Diagrams of the design rainfall derived by the regional frequency analysis using L-moments were drawn by region and consecutive duration with GIS techniques. (A minimal L-moment GEV fitting sketch appears below.)
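
As a hedged illustration of the L-moment approach mentioned in the abstract (not the paper's actual procedure or data), the sketch below computes sample L-moments from an annual-maximum rainfall series, fits a GEV distribution with Hosking's approximation, and evaluates a design quantile; the rainfall values are placeholders.

```python
# Hedged sketch: GEV fitting by L-moments (Hosking's approximation) and a design quantile.
# The annual-maximum series below is a placeholder, not data from the cited study.
import math

def sample_lmoments(x):
    """Unbiased sample L-moments l1, l2 and L-skewness t3."""
    x = sorted(x)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * xi for i, xi in enumerate(x)) / (n * (n - 1) * (n - 2))
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

def gev_from_lmoments(l1, l2, t3):
    """GEV parameters (location xi, scale alpha, shape k) via Hosking's approximation."""
    c = 2.0 / (3.0 + t3) - math.log(2) / math.log(3)
    k = 7.8590 * c + 2.9554 * c * c
    alpha = l2 * k / ((1 - 2 ** (-k)) * math.gamma(1 + k))
    xi = l1 - alpha * (1 - math.gamma(1 + k)) / k
    return xi, alpha, k

def gev_quantile(xi, alpha, k, F):
    """Design value for non-exceedance probability F (e.g. F = 0.99 -> 100-year event)."""
    return xi + alpha / k * (1 - (-math.log(F)) ** k)

annual_max = [78.2, 95.1, 110.4, 84.3, 132.0, 99.8, 120.5, 88.7,
              105.2, 141.6, 92.4, 118.9, 101.3, 86.5, 127.8]    # placeholder values (mm)
xi, alpha, k = gev_from_lmoments(*sample_lmoments(annual_max))
print(f"100-year design rainfall ~ {gev_quantile(xi, alpha, k, 0.99):.1f} mm")
```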

A Study on Word Cloud Techniques for Analysis of Unstructured Text Data (비정형 텍스트 테이터 분석을 위한 워드클라우드 기법에 관한 연구)

  • Lee, Won-Jo
    • The Journal of the Convergence on Culture Technology
    • /
    • v.6 no.4
    • /
    • pp.715-720
    • /
    • 2020
  • In big data analysis, text data is mostly unstructured and large in volume, and it has been difficult to analyze because analysis techniques are not well established. This study therefore verifies the usefulness and the problems of the word cloud technique, one of the text data analysis techniques, to assess its potential for practical use. The paper derives the limitations and problems of the technique through a visualization analysis of the "President UN Speech" using the word cloud technique of the R program, and proposes an improved model to solve these problems, thereby presenting an efficient way to apply the word cloud technique in practice. (A minimal word cloud sketch appears below.)
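
The cited study uses the R word cloud function; as a hedged, language-shifted illustration of the same idea (not the paper's workflow), the sketch below builds a word cloud in Python with the `wordcloud` package. The input file name and filtering choices are assumptions.

```python
# Hedged illustration of the word cloud technique using the Python `wordcloud` package
# (the cited paper works in R; this is not its actual procedure).
from collections import Counter
from wordcloud import WordCloud, STOPWORDS

text = open("speech.txt", encoding="utf-8").read()        # placeholder input file

# Simple tokenization and stop-word filtering to reduce noise terms before rendering.
tokens = [w.lower().strip('.,;:"()') for w in text.split()]
freq = Counter(w for w in tokens if w and w not in STOPWORDS)

wc = WordCloud(width=800, height=600, background_color="white", max_words=100)
wc.generate_from_frequencies(freq)
wc.to_file("wordcloud.png")                                # rendered cloud image
```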

A Study on the Classification of Variables Affecting Smartphone Addiction in Decision Tree Environment Using Python Program

  • Kim, Seung-Jae
    • International journal of advanced smart convergence
    • /
    • v.11 no.4
    • /
    • pp.68-80
    • /
    • 2022
  • Since the emergence of AI, technology development aimed at complete and sophisticated AI functionality has continued. Machine learning and deep learning techniques are the main tools in this effort; they encompass supervised learning, unsupervised learning, and reinforcement learning, and they use big data analysis to lay the groundwork for decision-making. Established decisions are then improved by repeatedly revising and renewing the decision criteria. In other words, big data analysis, which enables data classification and recognition, is a key technical element of AI, so big data analysis itself is important and requires sophisticated methods. In this study, among the various tools that can analyze big data, a Python program is used to find out which variables can affect smartphone addiction in a decision tree environment. We check whether data classification by decision tree in Python performs as well as other tools and whether it can lend reliability to decision-making about the addictiveness of smartphone use. The results show that big data analysis can be performed without problems using any of various statistical tools such as Python and R. (A minimal decision tree sketch appears below.)
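
The abstract describes building a decision tree in Python to identify variables affecting smartphone addiction. The study's data set and variable names are not given here, so the hedged sketch below uses scikit-learn's DecisionTreeClassifier on placeholder columns; the file name and feature names are assumptions, not the study's variables.

```python
# Hedged sketch: decision-tree classification and variable importance in Python.
# The CSV path, columns, and target are placeholders, not the cited study's data.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("smartphone_survey.csv")                  # placeholder data set
X = df[["daily_usage_hours", "sns_time", "game_time", "age", "sleep_hours"]]
y = df["addiction_flag"]                                    # placeholder 0/1 target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)

print("test accuracy:", tree.score(X_test, y_test))
# Feature importances indicate which variables the fitted tree relies on most.
for name, imp in sorted(zip(X.columns, tree.feature_importances_), key=lambda t: -t[1]):
    print(f"{name:>18}: {imp:.3f}")
```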

Advanced Offshore Pipelaying Analysis techniques Part 2 : Laybarge Methods (해저 파이프라인 가설 분석 기술)

  • Choe, Han-Seok
    • Journal of Ocean Engineering and Technology
    • /
    • v.9 no.2
    • /
    • pp.7-19
    • /
    • 1995
  • Various laybarge methods for offshore pipeline installation are introduced. Pipe stresses and strains during installation are discussed using linear and nonlinear analysis methods, several operational modes of offshore pipeline installation are described, and computer modelling techniques for pipeline installation analysis are suggested. (A simple catenary-based sketch appears below.)
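
As a hedged illustration of the kind of simplified pipelay check that precedes full linear or nonlinear analysis (a textbook natural-catenary estimate, not the paper's method), the sketch below estimates sagbend curvature, bending strain, and bending stress from horizontal tension and submerged weight; all input values are placeholders.

```python
# Hedged sketch: natural-catenary estimate of sagbend bending during laybarge installation.
# All inputs are placeholders; a real analysis would model stinger geometry, soil contact,
# and nonlinear material behaviour.
import math

D = 0.4064        # pipe outer diameter [m] (16 in), placeholder
E = 207e9         # steel Young's modulus [Pa]
w = 1200.0        # submerged weight per unit length [N/m], placeholder
H = 4.0e5         # horizontal lay tension [N], placeholder
depth = 150.0     # water depth [m], placeholder

kappa = w / H                                   # catenary curvature at the touchdown point
bend_strain = kappa * D / 2.0                   # outer-fibre bending strain
bend_stress = E * bend_strain                   # elastic bending stress

s = math.sqrt(depth * (depth + 2.0 * H / w))    # suspended arc length of the catenary
x = (H / w) * math.acosh(1.0 + w * depth / H)   # horizontal projection of the span

print(f"sagbend curvature : {kappa:.5f} 1/m")
print(f"bending strain    : {bend_strain * 100:.3f} %")
print(f"bending stress    : {bend_stress / 1e6:.1f} MPa")
print(f"suspended length  : {s:.1f} m, horizontal span: {x:.1f} m")
```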

REGENERATIVE BOOTSTRAP FOR SIMULATION OUTPUT ANALYSIS

  • Kim, Yun-Bae
    • Proceedings of the Korea Society for Simulation Conference
    • /
    • 2001.05a
    • /
    • pp.169-169
    • /
    • 2001
  • With the aid of fast computing power, resampling techniques are being introduced for simulation output analysis (SOA). Autocorrelation in the output of discrete-event simulation prohibits the direct application of classical resampling schemes, so variants such as the threshold bootstrap, binary bootstrap, and stationary bootstrap extend resampling to time-series data such as simulation output. We present a new method for inference from a regenerative process, the regenerative bootstrap, which equals or exceeds the performance of the classical regenerative method and approximate regeneration techniques. The regenerative bootstrap saves computation time and overcomes the problem of scarce regeneration cycles. Computational results are provided using an M/M/1 model. (A minimal regenerative bootstrap sketch appears below.)
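
As a hedged illustration of the regenerative bootstrap idea (not the author's exact algorithm or experimental setup), the sketch below simulates M/M/1 waiting times with the Lindley recursion, cuts the output into regeneration cycles at arrivals to an empty system, and resamples whole cycles to form a confidence interval for the mean wait; the parameters are placeholders.

```python
# Hedged sketch of a regenerative bootstrap for M/M/1 simulation output.
# Whole regeneration cycles (not individual observations) are resampled,
# so within-cycle autocorrelation is preserved. Parameters are placeholders.
import random

def mm1_waits(lam, mu, n, seed=1):
    """Waiting times in queue generated by the Lindley recursion."""
    rng = random.Random(seed)
    w, waits = 0.0, []
    for _ in range(n):
        waits.append(w)
        w = max(0.0, w + rng.expovariate(mu) - rng.expovariate(lam))
    return waits

def regeneration_cycles(waits):
    """Split the series at customers who arrive to an empty system (wait == 0)."""
    cycles, current = [], []
    for w in waits:
        if w == 0.0 and current:
            cycles.append(current)
            current = []
        current.append(w)
    if current:
        cycles.append(current)
    return cycles

def regenerative_bootstrap(cycles, B=1000, seed=2):
    """Bootstrap the ratio estimator: total wait per resample / customers per resample."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(B):
        sample = [rng.choice(cycles) for _ in range(len(cycles))]
        estimates.append(sum(sum(c) for c in sample) / sum(len(c) for c in sample))
    estimates.sort()
    return estimates[int(0.025 * B)], estimates[int(0.975 * B)]

waits = mm1_waits(lam=0.8, mu=1.0, n=10000)
cycles = regeneration_cycles(waits)
low, high = regenerative_bootstrap(cycles)
print(f"{len(cycles)} cycles, 95% bootstrap CI for mean wait: ({low:.3f}, {high:.3f})")
```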

Analyzing Production Data using Data Mining Techniques (데이터마이닝 기법의 생산공정데이터에의 적용)

  • Lee H.W.;Lee G.A.;Choi S.;Bae K.W.;Bae S.M.
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2005.06a
    • /
    • pp.143-146
    • /
    • 2005
  • Many data mining techniques have proved useful for revealing important patterns in large data sets. In particular, data mining plays an important role in customer data analysis in the financial industry and in electronic commerce, many data mining research papers have appeared in the semiconductor and automotive industries, and data mining techniques are also applied in bioinformatics. To satisfy customers' various requirements, each industry must develop new processes with more accurate production criteria and spends more money to guarantee product quality. In this context, we apply data mining techniques to production-related data in the automotive parts industry, such as test data, field claim data, and POP (point of production) data. Data collection and transformation techniques are applied to enhance the analysis results. We also classify the types of manufacturing processes and propose an analysis scheme for each type. As a result, we could find inter- and intra-process relationships and critical features for monitoring the current status of each process, which helps the industry raise its profit and reduce its failure cost. (A hedged sketch of this kind of analysis appears below.)
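
As a hedged illustration of linking production data with field-claim data (not the paper's actual scheme or data), the sketch below joins placeholder test records and claim records, computes a claim rate per process line, and ranks candidate critical features by their correlation with claims; all file and column names are assumptions.

```python
# Hedged sketch: relating production test data to field claims to flag per-process
# claim rates and candidate critical features. File and column names are placeholders.
import pandas as pd

tests = pd.read_csv("test_data.csv")      # columns: serial_no, process_line, torque, leak_rate
claims = pd.read_csv("field_claims.csv")  # columns: serial_no, claim_code, claim_date

# Label each produced part with whether it later generated a field claim.
tests["claimed"] = tests["serial_no"].isin(claims["serial_no"]).astype(int)

# Intra-process view: claim rate per process line.
claim_rate = tests.groupby("process_line")["claimed"].mean().sort_values(ascending=False)
print("claim rate by process line:\n", claim_rate)

# Candidate critical features: test measurements most correlated with claim occurrence.
features = ["torque", "leak_rate"]
print("correlation with claims:\n", tests[features + ["claimed"]].corr()["claimed"][features])
```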

Quantitative Proteomics Towards Understanding Life and Environment

  • Choi, Jong-Soon;Chung, Keun-Yook;Woo, Sun-Hee
    • Korean Journal of Environmental Agriculture
    • /
    • v.25 no.4
    • /
    • pp.371-381
    • /
    • 2006
  • New proteomic techniques have been developed extensively in recent years, enabling high-throughput, systematic analyses of cellular proteins in combination with bioinformatic tools. Such techniques facilitate the elucidation of protein functions under stress or disease conditions, leading to the discovery of biomarkers for responses to environmental stimuli. The ultimate objective of proteomics is the entire proteome of an organism: subcellular localization, biochemical activities, and their regulation. Comprehensive proteomic analysis strategies can be classified into three categories: (i) protein separation by two-dimensional gel electrophoresis (2-DE) or liquid chromatography (LC), (ii) protein identification by Edman sequencing or mass spectrometry (MS), and (iii) proteome quantitation. MS-based proteomics has shifted from qualitative proteome analysis, via 2-DE or 2D-LC coupled with off-line matrix-assisted laser desorption ionization (MALDI) and on-line electrospray ionization (ESI) MS, respectively, toward quantitative proteome analysis. In vitro quantitative proteomic techniques include differential gel electrophoresis with fluorescent dyes, protein labeling with isotope-coded affinity tags, and peptide labeling with isobaric tags for relative and absolute quantitation. In addition, stable isotope-labeled amino acids can be incorporated metabolically into live cultured cells for in vivo labeling. MS-based techniques also extend to phosphopeptide mapping of biologically crucial proteins associated with post-translational modifications. These complementary proteomic techniques contribute to our current understanding of how life responds to differing environments.

Empirical Analysis of the Feeling of Shooting in 2D Shooting Games (2차원 슈팅 게임에서의 타격감에 대한 실험적 분석)

  • Seo, Jin-Seok;Kim, Nam-Gyu
    • Journal of the Korea Society of Computer and Information
    • /
    • v.15 no.2
    • /
    • pp.75-81
    • /
    • 2010
  • Feeling of shooting is one of the most important features of shooting games, and game developers have tried to improve it using various techniques such as visual and sound effects, rumble effects, animations, and camera work. In this paper, we present the results of an empirical analysis of several of these techniques in a 2D shooting game. We carried out two experiments in which the level of feeling of shooting was measured in a simple 2D shooting game. The first experiment used 16 combinations of four techniques (visual, animation, sound, and rumble effects) applied to a shooting object (a cannon), and the second used 16 combinations of two techniques (visual and sound effects) applied to the shooting object, the exploding objects (enemy ships), or both. The analysis showed that each of the techniques was a statistically significant factor. We also found that sound effects and rumble effects are more effective than visual effects and animations, and that exploding objects are more important than the shooting object. (A hedged factorial analysis sketch appears below.)
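
The experiments are full-factorial comparisons of effect techniques, so as a hedged illustration (with synthetic placeholder ratings, not the study's measurements) the sketch below generates a 2x2x2x2 design like the first experiment and tests each factor's significance with an ANOVA in statsmodels.

```python
# Hedged sketch: factorial ANOVA for the significance of effect techniques.
# The ratings are synthetic placeholders, not the cited study's data.
import itertools
import random
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

rng = random.Random(0)
rows = []
for visual, animation, sound, rumble in itertools.product([0, 1], repeat=4):
    for _ in range(5):                        # five placeholder participants per condition
        rating = (3.0 + 0.4 * visual + 0.3 * animation + 0.9 * sound + 0.8 * rumble
                  + rng.gauss(0, 0.5))        # synthetic "feeling of shooting" score
        rows.append(dict(visual=visual, animation=animation,
                         sound=sound, rumble=rumble, rating=rating))

df = pd.DataFrame(rows)
model = ols("rating ~ C(visual) + C(animation) + C(sound) + C(rumble)", data=df).fit()
print(anova_lm(model, typ=2))                 # per-factor F statistics and p-values
```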