• Title/Summary/Keyword: Algebraic Method


Performance Improvements of SCAM Climate Model using LAPACK BLAS Library (SCAM 기상모델의 성능향상을 위한 LAPACK BLAS 라이브러리의 활용)

  • Dae-Yeong Shin;Ye-Rin Cho;Sung-Wook Chung
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology, v.16 no.1, pp.33-40, 2023
  • With the development of supercomputing and hardware technology, numerical computation methods are also advancing, making improved weather prediction possible. In this paper, we propose applying the LAPACK (Linear Algebra PACKage) BLAS (Basic Linear Algebra Subprograms) library to the linear-algebra computations in the source code of UNICON (A Unified Convection Scheme), the cumulus parameterization code that performs the atmospheric computations within SCAM (Single-Column Atmosphere Model, a simplified version of CESM, the Community Earth System Model). To analyze this, an overall execution-structure diagram of SCAM is presented and tests are conducted in the corresponding execution environment. Compared to the existing source code, the SCOPY function achieved a 0.4053% performance improvement, the DSCAL function 0.7812%, and the DDOT function 0.0469%, for an overall improvement of 0.8537%. This means that the proposed application of LAPACK BLAS, a library for dense linear-algebra operations, can improve performance in the same CPU environment without additional hardware.
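
The substitution pattern the abstract describes can be sketched as follows. This is a Python/NumPy illustration, not SCAM's Fortran source: the array values are made up, and NumPy's vectorized operations stand in for the Level-1 BLAS routines (copy, scale, dot product) named in the paper.

```python
import numpy as np

# Hand-written loops over double-precision arrays are replaced by
# BLAS-backed operations equivalent to xCOPY, xSCAL and xDOT.
x = np.arange(1.0, 6.0)      # [1, 2, 3, 4, 5]
y = np.empty_like(x)

# loop version:  for i in range(n): y[i] = x[i]
np.copyto(y, x)              # copy  (SCOPY/DCOPY)
# loop version:  for i in range(n): y[i] = 2.0 * y[i]
y *= 2.0                     # scale (DSCAL), done in place
# loop version:  s = 0.0; for i in range(n): s += x[i] * y[i]
dot = float(np.dot(x, y))    # dot product (DDOT)
```

The gain in the paper comes from handing these inner loops to a tuned BLAS implementation rather than the compiler's generic loop code.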

An Efficient Post-Quantum Signature Scheme Based on Multivariate-Quadratic Equations with Shorter Secret Keys (양자컴퓨터에 안전한 짧은 비밀키를 갖는 효율적인 다변수 이차식 기반 전자서명 알고리즘 설계)

  • Kyung-Ah Shim
    • Journal of the Korea Institute of Information Security & Cryptology, v.33 no.2, pp.211-222, 2023
  • Multivariate quadratic (MQ) equation-based public-key algorithms are among the promising post-quantum replacements for currently used public-key cryptography. After being selected as one of the digital signature finalists in NIST Post-Quantum Cryptography Standardization Round 3, Rainbow was cryptanalyzed by advanced algebraic attacks owing to its multi-layered structure, and research on MQ-based schemes now focuses on the single-layer UOV. In this paper, we propose a new MQ signature scheme based on UOV that combines a special structure of linear equations, sparse polynomials, and random polynomials to reduce the secret key size. Our scheme improves signing performance with a block inversion method using half-sized block matrices. We then provide a security analysis, suggest secure parameters at three security levels, and investigate the resulting key and signature sizes. Our scheme has the shortest signature length among post-quantum signature schemes based on other hard problems, and its secret key is up to 97% smaller than that of UOV.
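
The "block inversion using half-sized block matrices" mentioned above can be illustrated with the standard Schur-complement identity: inverting an n x n matrix via two inversions of n/2 x n/2 blocks. This sketch works over the reals with a made-up matrix; the paper's scheme operates on matrices over a finite field, which this illustration does not attempt.

```python
import numpy as np

# Example matrix, partitioned into four half-sized blocks A, B, C, D.
rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned example
A, B = M[:2, :2], M[:2, 2:]
C, D = M[2:, :2], M[2:, 2:]

Ai = np.linalg.inv(A)            # first half-sized inversion
S = D - C @ Ai @ B               # Schur complement of A
Si = np.linalg.inv(S)            # second half-sized inversion
Minv = np.block([[Ai + Ai @ B @ Si @ C @ Ai, -Ai @ B @ Si],
                 [-Si @ C @ Ai,              Si]])
```

Only half-sized inverses are ever computed, which is what makes the approach attractive for signing performance.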

The prediction of deformation according to tunnel excavation in weathered granite (화강 풍화암지반의 터널굴착에 따른 변형예측)

  • Cha, Bong-Geun;Kim, Young-Su;Kwo, Tae-Soon;Kim, Sung-Ho
    • Journal of Korean Tunnelling and Underground Space Association, v.12 no.4, pp.329-340, 2010
  • The mechanical behavior of underground openings such as tunnels is very difficult to estimate owing to the complexity and uncertainty of the ground. Behavior during tunnel excavation is usually predicted with model tests or numerical analysis, but a scale model test can hardly reproduce field conditions, and in numerical analysis it is hard to choose a suitable constitutive model and input data. To address this, this paper predicts tunnel deformation in weathered granite from crown settlement, convergence, and RMR data using regression analysis. The analysis shows that approximately 70~80% of the total crown settlement occurs within about 20 days of excavation. In predicting crown settlement and convergence, an exponential function matched the measurements more accurately than an algebraic function. The paper also obtains a correlation between RMR and the displacements of six monitored sections.
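
The regression comparison above can be sketched numerically. The settlement record below is synthetic (not the paper's data), and the two curve forms, exponential u(t) = u_inf(1 - e^(-bt)) and algebraic (hyperbolic) u(t) = t/(a + ct), are common choices in convergence regression; the paper does not state its exact functional forms.

```python
import numpy as np

# Synthetic crown-settlement record: days after excavation vs settlement (mm).
t = np.array([2.0, 5.0, 10.0, 15.0, 20.0, 30.0, 40.0])
u = 10.5 * (1 - np.exp(-0.2 * t)) + np.array([0.1, -0.1, 0.05, -0.05, 0.1, -0.1, 0.0])

def fit_exponential(t, u):
    # u(t) = u_inf * (1 - exp(-b t)): scan b, solve u_inf by least squares.
    best = None
    for b in np.linspace(0.01, 1.0, 500):
        f = 1 - np.exp(-b * t)
        u_inf = (u @ f) / (f @ f)
        rms = np.sqrt(np.mean((u_inf * f - u) ** 2))
        if best is None or rms < best[0]:
            best = (rms, u_inf, b)
    return best

def fit_hyperbolic(t, u):
    # u(t) = t / (a + c t), linearized as t/u = a + c t.
    c, a = np.polyfit(t, t / u, 1)
    return np.sqrt(np.mean((t / (a + c * t) - u) ** 2)), a, c

rms_exp = fit_exponential(t, u)[0]
rms_hyp = fit_hyperbolic(t, u)[0]
```

On data of this shape the exponential curve tracks the measurements more closely, mirroring the paper's finding.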

Design of a High-Speed Data Packet Allocation Circuit for Network-on-Chip (NoC 용 고속 데이터 패킷 할당 회로 설계)

  • Kim, Jeonghyun;Lee, Jaesung
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference, 2022.10a, pp.459-461, 2022
  • One of the major differences between Network-on-Chip (NoC) and existing parallel processing systems based on off-chip networks is that data packet routing is performed under a centralized control scheme. In such an environment, best-effort packet routing becomes a real-time assignment problem in which packet arrival time and processing time are the costs. In this paper, the Hungarian algorithm, a representative method for reducing the computational complexity of the linear assignment problem, is implemented as a hardware accelerator. In logic synthesis with the TSMC 0.18 um standard cell library, the circuit designed through case analysis of the cost distribution reduces area by about 16% and propagation delay by about 52% compared to a circuit implementing the original operation sequence of the Hungarian algorithm.
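
The assignment problem the accelerator solves can be stated in a few lines. This sketch solves it by exhaustive search for clarity; the paper's circuit implements the Hungarian algorithm, which reaches the same optimum in O(n^3) instead of O(n!). The cost matrix is made up for illustration.

```python
from itertools import permutations

def min_cost_assignment(cost):
    # Linear assignment: match each packet (row) to one output port (column)
    # so that the total cost is minimal. Brute force over all permutations.
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm, best_cost

# Toy cost matrix: entries combine packet arrival time and processing time.
cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]
```

Here the optimum routes packet 0 to port 1, packet 1 to port 0, and packet 2 to port 2.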

Analysis on letter and expressions in the elementary mathematics textbooks (초등수학 교과서에 제시된 문자와 식 내용 분석 -6차와 2007년 교육과정을 중심으로-)

  • Kim, Sung Ae;Kim, Sung Joon
    • Journal of Elementary Mathematics Education in Korea, v.17 no.1, pp.105-128, 2013
  • One of the biggest changes in the 2007 Curriculum Revision is the introduction of letters, equations, and direct and inverse proportion in the fifth and sixth grades of mathematics. The purpose of this study is to draw implications for teaching and learning methods for introducing letters and for teaching and learning equations by comparing the 6th Curriculum and the 2007 Curriculum Revision. The following conclusions were drawn. First, letters and expressions were taught in the fifth and sixth grades up to the 6th Curriculum, were moved to the seventh grade (middle school) in the 7th Curriculum, and were reintroduced into elementary school in the 2007 Curriculum Revision. These contents belonged to the 'Relationship' domain in the 6th Curriculum, to the 'Letters and expressions' domain in the 7th Curriculum, and to the 'Regularity and problem-solving' domain in the 2007 Curriculum Revision. Second, in the 6th Curriculum these contents were taught by first fixing definitions and then solving exercises, whereas the 7th Curriculum emphasized developing students' problem solving. The 2007 Curriculum Revision keeps mathematical terms to a minimum to reduce students' burden, offers problem situations drawn from everyday life, and emphasizes learners' communication. Through this study of letters and equations, we hope that elementary mathematics can be connected smoothly with higher mathematics. We also examined how to teach letters and expressions so as to reduce misconceptions and support the transition from arithmetic to algebraic thinking, and we hope further studies will continue this work.

Analysis for A Partial Distribution Loaded Orthotropic Rectangular Plate with Various Boundary Condition (다양한 경계조건에서 부분 분포 하중을 받는 이방성 사각평판 해석)

  • See, Sangkwang
    • Journal of the Korea institute for structural maintenance and inspection, v.22 no.5, pp.13-22, 2018
  • In this study, a governing differential equation for the bending problem of an orthotropic rectangular plate is derived, and its exact solution for various boundary conditions is presented. The solution follows the traditional approach of Navier's and Levy's solutions, transforming the governing differential equation into an algebraic equation by means of trigonometric series. Levy's method requires that two opposite edges of the plate be simply supported, and Navier's method is applicable only when all edges are simply supported. The solution in this study overcomes these limitations: it applies to any combination of simply supported and clamped edges in the x and y directions, and the plate may carry uniform, partially uniform, and line loads. Its advantage is that it is an exact solution while removing the restrictions of the previous Navier and Levy methods. Calculations are presented for orthotropic plates with nonsymmetric boundary conditions, and comparisons with the Navier, Levy, and Szilard solutions are made for isotropic plates; the deflections are in excellent agreement.
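
The trigonometric-series idea behind these solutions can be seen in Navier's classical double-sine series, sketched here for the simplest case the abstract mentions as a baseline: an isotropic, simply supported rectangular plate under uniform load (the paper itself treats the harder orthotropic, mixed-boundary cases).

```python
from math import pi, sin

def navier_deflection(x, y, a, b, q, D, terms=99):
    # Navier series for a simply supported isotropic plate under uniform
    # load q, bending stiffness D; only odd m, n contribute for this load.
    w = 0.0
    for m in range(1, terms + 1, 2):
        for n in range(1, terms + 1, 2):
            w += (sin(m * pi * x / a) * sin(n * pi * y / b)
                  / (m * n * ((m / a) ** 2 + (n / b) ** 2) ** 2))
    return 16 * q / (pi ** 6 * D) * w

# Centre deflection of a square plate, as a fraction of q*a^4/D; the
# classical coefficient from plate theory is about 0.00406.
alpha = navier_deflection(0.5, 0.5, 1.0, 1.0, q=1.0, D=1.0)
```

Each series term turns the fourth-order plate equation into an algebraic relation, which is the transformation the abstract refers to.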

Conjugate Simulation of Heat Transfer and Ablation in a Small Rocket Nozzle (소형 시험모터의 노즐 열전달 및 삭마 통합해석)

  • Bae, Ji-Yeul;Kim, Taehwan;Kim, Ji Hyuk;Ham, Heecheol;Cho, Hyung Hee
    • Journal of the Computational Structural Engineering Institute of Korea, v.30 no.2, pp.119-125, 2017
  • Ablative material in a rocket nozzle is exposed to high-temperature combustion gas and thus undergoes complicated thermal and chemical changes: chemical destruction of the surface and thermal decomposition of the inner material. Therefore, a method for conjugate analysis of the thermal response inside a carbon/phenolic material, including the nozzle flow, surface chemical reactions, and thermal decomposition, is developed in this research. CFD is used to simulate the flow field inside the nozzle and the conduction in the ablative material. The change in material density and the heat absorbed by thermal decomposition are accounted for in the solid energy equation, and an algebraic equation under the boundary-layer assumption is used to deduce the reaction rate on the surface and the resulting surface recession. To test the developed method, a small rocket nozzle is solved numerically. Although the predicted ablation at the nozzle throat is higher than in the experiment, the shape change and the temperature distribution inside the material are well predicted; the temperature error relative to the experimental results in the rapid-heating region is within 100 K.
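
The density change from thermal decomposition is commonly modeled with an Arrhenius rate law driving the material from virgin to char density. The sketch below illustrates that idea with explicit time stepping; all constants are illustrative, not the paper's values, and the real analysis couples this term to the solid energy equation and the CFD solution.

```python
import math

# Illustrative constants: pre-exponential factor (1/s), activation
# energy (J/mol), gas constant, virgin and char densities (kg/m^3).
A, E, R = 1.0e4, 7.0e4, 8.314
rho_v, rho_c = 1400.0, 1100.0

def decompose(T, rho, dt):
    # One explicit Euler step of d(rho)/dt = -A*exp(-E/(R*T))*(rho - rho_c)
    return rho - dt * A * math.exp(-E / (R * T)) * (rho - rho_c)

rho = rho_v
for _ in range(1000):              # 1 s of decomposition at 1500 K
    rho = decompose(1500.0, rho, 1e-3)
```

At this temperature the material is essentially fully charred after one second; the heat absorbed by the reaction would enter the energy equation as a sink proportional to the density change rate.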

A Comparison of Pre-Service Teachers' and Students' Understanding of the Concept of Parameters as Means of Generalization (일반화 수단으로서 매개변수의 인식과 오류에 대한 연구 -중학교 2학년 학생들과 예비교사들의 인식과 오류를 중심으로-)

  • Jee, Young Myong;Yoo, Yun Joo
    • School Mathematics, v.16 no.4, pp.803-825, 2014
  • From the early stages of learning algebra, literal symbols are used to represent algebraic objects such as variables and parameters. The concept of a parameter combines indeterminacy and fixity, which leads to confusion and errors in understanding. The purpose of this research is to compare beginners in algebra with pre-service teachers who have completed secondary mathematics education in terms of their understanding of this paradoxical nature of parameters. We recruited 35 eighth-grade middle school students and 73 pre-service teachers enrolled in an undergraduate course at one university, and conducted a survey on the perception of parameters, asking whether one considers the parameters given in a problem to be variables or constants. We analyzed the collected data with a mixed qualitative and quantitative approach and identified several difficulties in understanding parameters in both groups. In particular, our statistical analysis revealed that the proportions of subjects with a limited understanding of the concept of parameters do not differ much between the two groups. This suggests that learning algebra in secondary mathematics education does not significantly improve understanding of the nature of parameters.

Multi-Vector Document Embedding Using Semantic Decomposition of Complex Documents (복합 문서의 의미적 분해를 통한 다중 벡터 문서 임베딩 방법론)

  • Park, Jongin;Kim, Namgyu
    • Journal of Intelligence and Information Systems, v.25 no.3, pp.19-41, 2019
  • In line with the rapidly increasing demand for text data analysis, research and investment in text mining are active not only in academia but also across industries. Text mining is generally conducted in two steps. In the first step, the text of the collected documents is tokenized and structured to convert the original documents into a computer-readable form. In the second step, tasks such as document classification, clustering, and topic modeling are conducted according to the purpose of the analysis. Until recently, text mining studies focused on the second step. However, with the recognition that the structuring process substantially influences the quality of the analysis results, various embedding methods have been actively studied to preserve the meaning of words and documents when representing text data as vectors. Unlike structured data, which can be fed directly into traditional operations and analysis techniques, unstructured text must first be structured into a form a computer can process. "Embedding" refers to mapping arbitrary objects into a space of a given dimension while maintaining their algebraic properties. Recently, attempts have been made to embed not only words but also sentences, paragraphs, and entire documents, and as the demand for document embedding grows rapidly, many supporting algorithms have been developed. Among them, doc2Vec, which extends word2Vec and embeds each document into one vector, is the most widely used. However, traditional document embedding methods such as doc2Vec generate the document vector from the entire corpus of words in the document, so the vector is affected not only by core words but also by miscellaneous words. In addition, traditional schemes usually map each document to a single vector, which makes it difficult to represent a complex document with multiple subjects accurately. In this paper, we propose a new multi-vector document embedding method to overcome these limitations. This study targets documents that explicitly separate body content and keywords; for documents without keywords, the method can be applied after extracting keywords through other analysis techniques, but since that is not the core of the proposal, we describe the process for documents with predefined keywords. The proposed method consists of (1) parsing, (2) word embedding, (3) keyword vector extraction, (4) keyword clustering, and (5) multiple-vector generation. Specifically, all text in a document is tokenized and each token is represented as an N-dimensional real-valued vector through word embedding. Then, to avoid the influence of miscellaneous words, the vectors corresponding to each document's keywords are extracted to form a set of keyword vectors per document. Next, this keyword set is clustered to identify the multiple subjects included in the document. Finally, a multi-vector is generated from the keyword vectors constituting each cluster. Experiments on 3,147 academic papers revealed that the single-vector traditional approach cannot properly map complex documents because of interference among subjects within each vector, whereas the proposed multi-vector method vectorizes complex documents more accurately by eliminating this interference.
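
Steps (3)-(5) of the pipeline described above can be sketched on toy data. The 2-D "keyword vectors" below are made up (real ones would come from a word-embedding model), and a minimal k-means with k=2 stands in for the clustering step, which the abstract does not pin to a specific algorithm.

```python
import numpy as np

# Keyword vectors for one document: two groups of keywords, one per subject.
keyword_vecs = np.array([
    [0.9, 0.1], [1.0, 0.2], [0.8, 0.0],   # keywords about subject A
    [0.1, 0.9], [0.0, 1.0], [0.2, 0.8],   # keywords about subject B
])

def two_means(X, iters=10):
    # Minimal k-means (k=2) with deterministic init: first and last point.
    c = np.stack([X[0], X[-1]])
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - c[None], axis=2)  # point-to-centroid
        labels = d.argmin(axis=1)
        c = np.stack([X[labels == k].mean(axis=0) for k in range(2)])
    return labels, c

# Each cluster centroid becomes one of the document's vectors.
labels, doc_vectors = two_means(keyword_vecs)
```

The document ends up with one vector per subject instead of a single vector averaged across both, which is the interference the paper aims to eliminate.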

A Preliminary Quantification of $^{99m}Tc$-HMPAO Brain SPECT Images for Assessment of Volumetric Regional Cerebral Blood Flow ($^{99m}Tc$-HMPAO 뇌혈류 SPECT 영상의 부위별 체적 혈류 평가에 관한 기초 연구)

  • Kwark, Cheol-Eun;Park, Seok-Gun;Yang, Hyung-In;Choi, Chang-Woon;Lee, Kyung-Han;Lee, Dong-Soo;Chung, June-Key;Lee, Myung-Chul;Koh, Chang-Soon
    • The Korean Journal of Nuclear Medicine, v.27 no.2, pp.170-174, 1993
  • Quantitative methods for assessing cerebral blood flow with $^{99m}Tc$-HMPAO brain SPECT estimate relative regional cerebral blood flow from the measured count distribution in a specific reconstructed tomographic slice, or in the algebraic sum of a few neighboring slices, rather than from the true volumetric distribution, and consequently produce biased estimates of the true regional cerebral blood flow. These biases are thought to originate mainly from the arbitrarily irregular shapes of the cerebral regions of interest (ROIs) being analyzed. In this study, a semi-automated method for direct quantification of the volumetric regional cerebral blood flow estimate is proposed, and the results are compared with those calculated by the previous planar approaches. Bias factors due to the partial volume effect and to uncertainty in ROI determination are not considered here, since the aim is a methodological comparison of the planar and volumetric assessment protocols.
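
The planar-versus-volumetric bias described above can be shown on a toy count volume. The array and the irregular ROI below are made up purely for illustration: when the ROI's cross-section varies from slice to slice, a single-slice (or few-slice) sum does not represent the whole-ROI count.

```python
import numpy as np

# Toy SPECT count volume (slices x rows x cols) with an irregular 3-D ROI:
# the ROI is small in one slice and larger in its neighbor.
counts = np.zeros((5, 8, 8))
counts[1, 2:4, 2:4] = 1.0
counts[2, 2:6, 2:6] = 1.0
mask = counts > 0

volumetric = counts[mask].sum()        # sum over the whole 3-D ROI
planar = counts[2][mask[2]].sum()      # estimate from a single mid-slice
```

Here the single-slice estimate captures only part of the ROI's counts, which is the kind of bias the proposed volumetric quantification avoids.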
