• Title/Summary/Keyword: software algorithms

Search results: 1,093

Noisy Power Quality Recognition System using Wavelet based Denoising and Neural Networks (웨이블릿 기반 잡음제거와 신경회로망을 이용한 잡음 전력 품질 인식 시스템)

  • Chong, Won-Yong
    • Journal of the Institute of Convergence Signal Processing / v.13 no.2 / pp.91-98 / 2012
  • Power quality (PQ) disturbances such as sag, swell, harmonics, and impulsive transients are major issues in the operation of power-electronics-based devices and microprocessor-based equipment. This paper presents the effectiveness of wavelet-based denoising techniques in recognizing different power quality events under noise. The algorithms involved in the noisy PQ recognition system are wavelet-based denoising and back-propagation neural networks. In addition, to verify the real-time performance of the noisy PQ recognition system in noisy environments, SIL (Software-In-the-Loop) and PIL (Processor-In-the-Loop) simulations were carried out, resulting in excellent recognition performance.
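The denoising front end described above can be sketched as a wavelet transform followed by soft thresholding of the detail coefficients. The single-level Haar transform and the threshold value below are illustrative assumptions; the abstract does not specify the wavelet basis or threshold rule used in the paper.

```python
def haar_forward(x):
    """One level of the Haar wavelet transform: approximation and detail coefficients."""
    approx = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Reconstruct the signal from approximation and detail coefficients."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

def soft_threshold(coeffs, t):
    """Shrink detail coefficients toward zero; small (noise-dominated) ones vanish."""
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

def denoise(signal, t=0.1):
    """Denoise by thresholding the detail band, then reconstructing."""
    a, d = haar_forward(signal)
    return haar_inverse(a, soft_threshold(d, t))
```

With `t=0` the round trip is exact; with a small positive threshold, low-amplitude ripple in the detail band is suppressed while the coarse signal shape survives.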

A Survey of the Application of Blockchain in Multiple Fields of Financial Services

  • Wang, Yiran;Kim, Dae-Kyoo;Jeong, Dongwon
    • Journal of Information Processing Systems / v.16 no.4 / pp.935-958 / 2020
  • The core value of finance is credit; it can be said that without credit, there can be no finance. The distributed structure of the blockchain and its low-cost trust-building mechanism based on mathematical algorithms provide a new solution and path for solving and optimizing related problems in the financial field. Blockchain technology is applied in the development of the financial industry through consensus mechanisms, smart contracts, and distributed networks. This research presents a comprehensive survey of blockchain technology in the development of financial services, covering equity crowdfunding and credit investigation in inclusive finance, cross-border remittance, Internet financial payment, P2P lending, supply chain finance, and the application of blockchain to anti-money laundering. The paper discusses the role of blockchain in solutions to different issues in the financial field, as well as the architectures of different financial service application scenarios from the perspectives of the financial trust mechanism and of the technology and rule changes that blockchain brings to financial innovation. Finally, the problems and challenges of blockchain in financial services are discussed, and corresponding solutions are proposed.

Development of Transmission Simulator for High-Speed Tracked Vehicles (고속 무한궤도 차량용 변속기 시뮬레이터 개발)

  • Jung, Gyuhong
    • Journal of Drive and Control / v.14 no.4 / pp.29-36 / 2017
  • Electronic control technologies long developed for passenger cars have spread to construction equipment and agricultural vehicles because of the outstanding performance achieved by embedded software. In particular, the system program of the transmission control unit (TCU) plays a crucial role in shift quality, driving performance, and fuel efficiency. Since the control algorithm is embedded in software that is rarely analyzed, development of such a TCU cannot be conducted by conventional reverse engineering. A transmission simulator is an electronic device that simulates the electrical signals, including driver operation commands and the outputs of the various sensors installed in the transmission. A standalone TCU can be run in normal operation mode with the signals provided by the transmission simulator. In this research, a transmission simulator for the tracked-vehicle TCU was developed to analyze the shift control algorithm through experiments with a standalone TCU. It was confirmed that shift experimental data for the simulator setup conditions can be used to analyze the control algorithms for the proportional solenoid valves and the shift map.

A Study on Regular Grid Based Real-Time Terrain LOD Algorithm for Enhancing Memory Efficiency (메모리 효율 향상을 위한 고정격자기반 실시간 지형 LOD 알고리즘에 관한 연구)

  • Whangbo Taeg-keun;Yang Young-Kyu;Moon Min-Soo
    • Korean Journal of Remote Sensing / v.20 no.6 / pp.409-418 / 2004
  • LOD (level of detail) is a technique widely used in 3D games and animation to render large 3D data sets smoothly in real time. Most LOD algorithms use a binary tree to keep ancestor information. The algorithm proposed in this paper, however, does not keep ancestor information, and thus uses less memory and increases rendering performance. To verify the efficiency of the proposed algorithm, a performance comparison with ROAM was conducted in real-time 3D terrain navigation. Results show that the proposed algorithm uses about a quarter of the memory of ROAM and renders about four times faster.

Securing a Cyber Physical System in Nuclear Power Plants Using Least Square Approximation and Computational Geometric Approach

  • Gawand, Hemangi Laxman;Bhattacharjee, A.K.;Roy, Kallol
    • Nuclear Engineering and Technology / v.49 no.3 / pp.484-494 / 2017
  • In industrial plants such as nuclear power plants, system operations are performed by embedded controllers orchestrated by Supervisory Control and Data Acquisition (SCADA) software. A targeted attack (also termed a control-aware attack) on the controller or SCADA software can lead a control system to operate in an unsafe mode or sometimes to a complete shutdown of the plant. Such malware attacks can result in tremendous cost to the organization for recovery, cleanup, and maintenance activity. SCADA systems in operational mode generate huge log files. These files are useful for analysis of plant behavior and for diagnostics during an ongoing attack; however, they are bulky and difficult to inspect manually. Data mining techniques such as least squares approximation and computational methods can be used to analyze the logs and to take proactive action when required. This paper explores methodologies and algorithms for developing an effective monitoring scheme against control-aware cyber attacks. It also explains soft computation techniques, such as the computational geometric method and least squares approximation, that can be effective in monitor design. The paper provides insight into diagnostic monitoring and demonstrates its effectiveness through attack simulations on a four-tank model, using computational techniques for diagnosis. Cyber security of the instrumentation and control systems used in nuclear power plants is of paramount importance, and such systems could be a target of these attacks.
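As a concrete illustration of the least-squares monitoring idea above: fit a linear trend to a logged process variable and flag samples whose residual exceeds a tolerance. The linear model and the threshold are assumptions made for this sketch; the paper combines least squares with computational-geometric methods that are not reproduced here.

```python
def fit_line(ts, ys):
    """Ordinary least-squares fit of y = slope * t + intercept."""
    n = len(ts)
    mean_t = sum(ts) / n
    mean_y = sum(ys) / n
    slope = (sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, ys))
             / sum((t - mean_t) ** 2 for t in ts))
    intercept = mean_y - slope * mean_t
    return slope, intercept

def flag_anomalies(ts, ys, threshold):
    """Return indices of log samples whose residual from the trend exceeds threshold."""
    slope, intercept = fit_line(ts, ys)
    return [i for i, (t, y) in enumerate(zip(ts, ys))
            if abs(y - (slope * t + intercept)) > threshold]
```

In a monitoring loop, the trend would be fitted on known-good log data and the residual check applied to incoming samples, so that a control-aware attack that bends the process variable away from its expected trajectory raises a flag.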

Audio and Video Bimodal Emotion Recognition in Social Networks Based on Improved AlexNet Network and Attention Mechanism

  • Liu, Min;Tang, Jun
    • Journal of Information Processing Systems / v.17 no.4 / pp.754-771 / 2021
  • In continuous-dimension emotion recognition, the parts that highlight emotional expression are not the same in each mode, and the influences of different modes on the emotional state also differ. This paper therefore studies the fusion of the two most important modes in emotion recognition (voice and facial expression) and proposes a bimodal emotion recognition method that combines an improved AlexNet network with an attention mechanism. After simple preprocessing of the audio and video signals, the first step uses prior knowledge to extract audio features. Facial expression features are then extracted by the improved AlexNet network. Finally, a multimodal attention mechanism fuses the facial expression and audio features, and an improved loss function mitigates the modality-missing problem, improving the robustness of the model and the performance of emotion recognition. Experimental results show that the concordance correlation coefficients of the proposed model in the arousal and valence dimensions were 0.729 and 0.718, respectively, which are superior to several comparative algorithms.
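The evaluation metric cited above, the concordance correlation coefficient (CCC), can be computed directly from its definition. The sketch below is a plain implementation of Lin's CCC, not code from the paper; it penalizes both low correlation and any bias between predictions and ground truth.

```python
def ccc(x, y):
    """Lin's concordance correlation coefficient:
    2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)
```

Perfect agreement gives 1.0 and perfect anti-agreement gives -1.0; unlike plain Pearson correlation, a constant offset between the two sequences also lowers the score.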

Software Fault Prediction using Semi-supervised Learning Methods (세미감독형 학습 기법을 사용한 소프트웨어 결함 예측)

  • Hong, Euyseok
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.19 no.3 / pp.127-133 / 2019
  • Most studies of software fault prediction have built supervised learning models that use only labeled training data. Although supervised learning usually shows high prediction performance, most development groups do not have sufficient labeled data. Unsupervised learning models, which use only unlabeled data for training, are difficult to build and show poor performance. Semi-supervised learning models, which use both labeled and unlabeled data, can solve these problems. Among semi-supervised techniques, self-training requires the fewest assumptions and constraints. In this paper, we implemented several models using self-training algorithms and evaluated them using accuracy and AUC. As a result, YATSI showed the best performance.
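The self-training loop referred to above can be sketched as: train on the labeled data, pseudo-label the unlabeled pool, absorb the confident predictions, and retrain. The nearest-centroid base learner and the confidence rule below are stand-in assumptions for illustration; the paper evaluates several self-training variants, including YATSI, whose internals are not reproduced here.

```python
def centroid_fit(X, y):
    """Fit a nearest-centroid classifier: one mean vector per class."""
    classes = sorted(set(y))
    return {c: tuple(sum(v) / len(v) for v in
                     zip(*[x for x, label in zip(X, y) if label == c]))
            for c in classes}

def predict_with_conf(model, x):
    """Predict the nearest class and a crude distance-based confidence in [0, 1]."""
    dists = {c: sum((a - b) ** 2 for a, b in zip(x, m)) ** 0.5
             for c, m in model.items()}
    c = min(dists, key=dists.get)
    total = sum(dists.values())
    conf = 1 - dists[c] / total if total else 1.0
    return c, conf

def self_train(X_lab, y_lab, X_unlab, conf_threshold=0.7, rounds=5):
    """Repeatedly absorb confidently pseudo-labeled points into the training set."""
    X, y = list(X_lab), list(y_lab)
    pool = list(X_unlab)
    for _ in range(rounds):
        model = centroid_fit(X, y)
        confident = [(x, predict_with_conf(model, x))
                     for x in pool]
        confident = [(x, c) for x, (c, conf) in confident if conf >= conf_threshold]
        if not confident:
            break
        for x, c in confident:
            X.append(x)
            y.append(c)
            pool.remove(x)
    return centroid_fit(X, y)
```

The key design choice in any self-training scheme is the confidence threshold: too low and wrong pseudo-labels poison the training set, too high and the unlabeled data is never used.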

Development of a New TensorFlow Tutorial Model for Machine Learning: Focusing on the Kaggle Titanic Dataset (텐서플로우 튜토리얼 방식의 머신러닝 신규 모델 개발 : 캐글 타이타닉 데이터 셋을 중심으로)

  • Kim, Dong Gil;Park, Yong-Soon;Park, Lae-Jeong;Chung, Tae-Yun
    • IEMEK Journal of Embedded Systems and Applications / v.14 no.4 / pp.207-218 / 2019
  • The purpose of this study is to develop a model for systematically studying the whole machine learning process. Since the existing model describes the learning process with minimal coding, learners can follow the progress of machine learning sequentially through the new model and visualize each step using TensorFlow. The new model uses all of the existing model's algorithms and confirms the importance of the variables that affect the target variable, survival. The training data were split into training and validation sets, and the performance of the model was evaluated with test data. In the final analysis, the ensemble technique showed the highest performance among all tutorial models, and the maximum performance of the model improved by up to 5.2% compared with the existing model. Future research should construct an environment in which machine learning can be studied regardless of the data preprocessing method and OS, and train models that outperform the existing ones.
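The ensemble result highlighted above rests on combining the individual tutorial models' predictions. A common combiner is majority voting, sketched below; the lambda "models" in the usage note are hypothetical stand-ins for trained classifiers, not the paper's actual models.

```python
from collections import Counter

def majority_vote(predictions):
    """Return the most common prediction; ties resolve to the first one seen."""
    return Counter(predictions).most_common(1)[0][0]

def ensemble_predict(models, x):
    """Apply each model to x and combine the per-model predictions by majority vote."""
    return majority_vote([m(x) for m in models])
```

For example, with three threshold classifiers `[lambda v: v > 0.5, lambda v: v > 0.3, lambda v: v > 0.7]`, an input of 0.6 is accepted by two of the three and the ensemble predicts `True`. Voting like this can only beat the individual models when their errors are at least partly independent, which is why ensembles of diverse base learners tend to top leaderboards such as Kaggle's.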

Analysis of Diagnosis Algorithm Implemented in TCU for High-Speed Tracked Vehicles (고속 무한궤도 차량용 변속제어기 진단 알고리즘 분석)

  • Jung, Gyuhong
    • Journal of Drive and Control / v.15 no.4 / pp.30-38 / 2018
  • Electronic control units (ECUs) are now ubiquitous and have evolved further toward high-end applications such as autonomous vehicles in the automotive industry. Such digital technologies have also become widespread in agriculture and construction equipment. Likewise, transmission control of high-speed tracked vehicles is based on the transmission control unit (TCU), which performs complex gear-change control functions and diagnostic algorithms (the TCU's self-diagnosis and reporting of malfunction data through CAN communication). Since all functions of the TCU are implemented in embedded software, it is hardly possible to analyze its specifications by reverse engineering. In this paper, a real-time transmission simulator adaptable to the TCU is presented for the analysis of the diagnosis algorithm and standards. Signal simulation circuits are designed considering the electrical characteristics of the TCU inputs, and various analysis tools, such as an analog-input auto-scan function and a global output-enable switch, are implemented in software. Test results from the hardware-in-the-loop simulator verify the tolerance time for each error, as well as the cause of fault and the error-reset conditions.

Power-based Side-Channel Analysis Against AES Implementations: Evaluation and Comparison

  • Benhadjyoussef, Noura;Karmani, Mouna;Machhout, Mohsen
    • International Journal of Computer Science & Network Security / v.21 no.4 / pp.264-271 / 2021
  • From an information security perspective, protecting sensitive data requires algorithms that resist theoretical attacks. However, treating an algorithm in a purely mathematical fashion, in other words abstracting away from its physical (hardware or software) implementation, opens the door to various real-world security threats. In the modern age of electronics, cryptanalysis attempts to reveal secret information based on a cryptosystem's physical properties rather than by exploiting theoretical weaknesses in the implemented cryptographic algorithm. The correlation power attack (CPA) is a side-channel analysis attack used to reveal sensitive information from the power leakage of a device. In this paper, we present a power-hacking technique to demonstrate how power analysis can be exploited to reveal the secret information in an AES crypto-core. In the proposed case study, we explain the main techniques that can break the security of the considered crypto-core using a CPA attack. Using two cryptographic devices, an FPGA and an 8051 microcontroller, the experimental attack procedure shows that the AES hardware implementation has better resistance against power attack than the software one. On the other hand, we note that the efficiency of the CPA attack depends statistically on the implementation and on the power model used for power prediction.
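The CPA mechanics described above can be sketched in a few lines: for each key guess, predict the power consumption of an intermediate value under a leakage model and correlate the prediction with the measured traces; the guess with the highest correlation wins. The sketch below uses the Hamming-weight leakage model on a 4-bit toy cipher (the PRESENT S-box standing in for the AES S-box) and noise-free traces, all assumptions made for brevity rather than details from the paper.

```python
# PRESENT cipher 4-bit S-box, used here as a toy stand-in for the AES S-box.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def hw(v):
    """Hamming weight: the leakage model's estimate of power drawn by value v."""
    return bin(v).count("1")

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def cpa_recover_key(plaintexts, traces):
    """Return the 4-bit key guess whose predicted leakage HW(SBOX[p ^ k])
    correlates best with the measured traces."""
    return max(range(16),
               key=lambda k: pearson([hw(SBOX[p ^ k]) for p in plaintexts], traces))
```

A real attack differs mainly in scale: 256 guesses per AES key byte, thousands of noisy multi-sample traces, and correlation evaluated at every sample point, but the guess-predict-correlate structure is the same, which is why unprotected software implementations leak so readily.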