• Title/Summary/Keyword: Annealing Algorithm

A Study on the Improvement of Convergence Speed in Nonlinear Optimization Problems (비선형 최적화의 수렴속도 개선에 관한 연구)

  • Lee, Young-J.; Lee, Kwon-S.; Lee, Jun-T.
    • Proceedings of the KIEE Conference / 1993.07a / pp.348-351 / 1993
  • The simulated annealing (SA) algorithm is a stochastic strategy for searching for the ground state and a powerful optimization tool, based on the annealing process used for crystallization in physical systems. Its main disadvantage is its long convergence time. This paper therefore shows that a new SA-based algorithm can be applied to reduce the computation time. The idea is applied to a nonlinear parameter estimation problem.
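
For readers unfamiliar with the mechanism summarized above, the following minimal Python sketch shows simulated annealing applied to a nonlinear parameter estimation problem. The exponential model, synthetic data, and cooling schedule are illustrative assumptions, not details taken from the paper.

```python
# Minimal simulated-annealing sketch for nonlinear parameter estimation.
import math
import random

random.seed(0)

# Synthetic data from an assumed nonlinear model y = a * exp(b * x) plus noise.
TRUE_A, TRUE_B = 2.0, -0.5
xs = [0.1 * i for i in range(50)]
ys = [TRUE_A * math.exp(TRUE_B * x) + random.gauss(0, 0.01) for x in xs]

def cost(params):
    """Sum of squared residuals between the model and the data."""
    a, b = params
    return sum((a * math.exp(b * x) - y) ** 2 for x, y in zip(xs, ys))

def anneal(start, temp=1.0, cooling=0.995, steps=20000):
    current, current_cost = list(start), cost(start)
    best, best_cost = list(current), current_cost
    for _ in range(steps):
        # Propose a small random perturbation of one parameter.
        cand = list(current)
        i = random.randrange(len(cand))
        cand[i] += random.gauss(0, 0.05)
        cand_cost = cost(cand)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if cand_cost < current_cost or random.random() < math.exp((current_cost - cand_cost) / temp):
            current, current_cost = cand, cand_cost
            if current_cost < best_cost:
                best, best_cost = list(current), current_cost
        temp *= cooling  # geometric cooling schedule
    return best, best_cost

est, err = anneal([1.0, 0.0])
print("estimated (a, b):", est, "residual:", err)
```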

Demand-based charging strategy for wireless rechargeable sensor networks

  • Dong, Ying; Wang, Yuhou; Li, Shiyuan; Cui, Mengyao; Wu, Hao
    • ETRI Journal / v.41 no.3 / pp.326-336 / 2019
  • A wireless power transfer technique can solve the power capacity problem in wireless rechargeable sensor networks (WRSNs), and the charging strategy is a widespread research problem. In this paper, we propose a demand-based charging strategy (DBCS) for WRSNs. We improve the charging program in four respects: the clustering method, the selection of to-be-charged nodes, the charging path, and the charging schedule. First, we propose a multipoint improved K-means (MIKmeans) clustering algorithm that balances energy consumption by grouping nodes based on location, residual energy, and historical contribution. Second, we propose a dynamic selection algorithm for charging nodes (DSACN) to select nodes for on-demand charging. Third, we design simulated annealing based on performance and efficiency (SABPE) to optimize the charging path of a mobile charging vehicle (MCV) and reduce the charging time. Last, we propose the DBCS to enhance the efficiency of the MCV. Simulations show that the strategy shortens the charging path and thereby increases communication effectiveness and residual energy utility.
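
The charging-path step above is essentially a tour optimization. The sketch below is a generic simulated-annealing tour optimizer for a mobile charging vehicle visiting to-be-charged nodes; the node coordinates, cooling schedule, and 2-opt neighborhood are assumptions, not the paper's SABPE algorithm.

```python
# Illustrative SA tour optimizer for a mobile charging vehicle.
import math
import random

random.seed(1)
nodes = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(15)]

def tour_length(order):
    """Total travel distance of a closed tour over the charging nodes."""
    return sum(math.dist(nodes[order[i]], nodes[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def anneal_tour(temp=100.0, cooling=0.999, steps=30000):
    order = list(range(len(nodes)))
    cur_len = tour_length(order)
    best, best_len = list(order), cur_len
    for _ in range(steps):
        i, j = sorted(random.sample(range(len(nodes)), 2))
        cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]  # 2-opt style reversal
        cand_len = tour_length(cand)
        if cand_len < cur_len or random.random() < math.exp((cur_len - cand_len) / temp):
            order, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = list(order), cur_len
        temp *= cooling
    return best, best_len

path, length = anneal_tour()
print("charging path:", path, "length: %.1f" % length)
```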

A Clustered Reconfigurable Interconnection Network BIST Based on Signal Probabilities of Deterministic Test Sets (결정론적 테스트 세트의 신호확률에 기반을 둔 clustered reconfigurable interconnection network 내장된 자체 테스트 기법)

  • Song Dong-Sup; Kang Sungho
    • Journal of the Institute of Electronics Engineers of Korea SD / v.42 no.12 / pp.79-90 / 2005
  • In this paper, we propose a new clustered reconfigurable interconnection network (CRIN) BIST to improve the embedding probabilities of random-pattern-resistant patterns. The proposed method uses a scan-cell reordering technique based on the signal probabilities of the given test cubes, together with specific hardware blocks that increase the embedding probabilities of care-bit-clustered scan-chain test cubes. We developed a simulated-annealing-based algorithm that reorders scan cells to maximize the embedding probabilities of scan-chain test cubes, and an iterative algorithm for synthesizing the CRIN hardware. Experimental results demonstrate that the proposed CRIN BIST technique achieves complete fault coverage with a lower storage requirement and shorter testing time than conventional methods.
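
The scan-cell reordering step can be pictured as SA over permutations. The hypothetical sketch below rewards orderings that pull each test cube's care bits close together; the test cubes, the clustering score, and the pairwise-swap neighborhood are invented, and the paper's actual embedding-probability model and CRIN synthesis are not reproduced.

```python
# Hypothetical SA-based scan-cell reordering sketch.
import math
import random

random.seed(2)
# Test cubes over scan cells: '0'/'1' are care bits, 'X' is a don't-care.
cubes = ["1XX0XXXX1X", "XX10XXXXXX", "XXXX11XX0X", "0XXXXXX1XX"]
n_cells = len(cubes[0])

def score(order):
    """Reward orderings that cluster each cube's care bits (smaller span is better)."""
    total = 0
    for cube in cubes:
        pos = [i for i, cell in enumerate(order) if cube[cell] != 'X']
        total -= (max(pos) - min(pos))  # negative span; higher is better
    return total

def anneal_order(temp=5.0, cooling=0.995, steps=20000):
    order = list(range(n_cells))
    cur_s = score(order)
    best, best_s = list(order), cur_s
    for _ in range(steps):
        i, j = random.sample(range(n_cells), 2)
        order[i], order[j] = order[j], order[i]          # swap two scan cells
        new_s = score(order)
        if new_s > cur_s or random.random() < math.exp((new_s - cur_s) / temp):
            cur_s = new_s
            if cur_s > best_s:
                best, best_s = list(order), cur_s
        else:
            order[i], order[j] = order[j], order[i]      # undo rejected swap
        temp *= cooling
    return best, best_s

print(anneal_order())
```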

Optimal depth for dipping sonar system using optimization algorithm (최적화 알고리즘을 적용한 디핑소나 최적심도 산출)

  • An, Sangkyum
    • The Journal of the Acoustical Society of Korea / v.39 no.6 / pp.541-548 / 2020
  • To overcome the disadvantages of hull-mounted sonar, many countries operate helicopter-borne dipping sonar systems. Although limited in performance, a dipping sonar ensures the survivability of the surface ship and improves detection performance because its depth can be adjusted to the ocean environment. In this paper, a method for calculating the optimal depth of a helicopter dipping sonar is proposed by applying an optimization algorithm. To evaluate sonar performance, a Sonar Performance Function (SPF) is defined that accounts for the ocean environment, the target depth, and the dipping sonar depth. To reduce computation time, the optimal depth is calculated with Simulated Annealing (SA), one of the standard optimization algorithms. For verification, the optimal depth obtained with the optimization technique is compared with direct calculation of the SPF. The paper also presents optimal depth results for the ocean environment of the East Sea.
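
As a rough illustration of the search itself, the sketch below runs simulated annealing over a single depth variable against a made-up stand-in performance function; the paper's SPF depends on the ocean environment and target depth, which are not modeled here.

```python
# Hedged sketch: SA over a single dipping-sonar depth with a placeholder score.
import math
import random

random.seed(3)
MAX_DEPTH = 300.0  # metres, assumed operating range

def spf(depth):
    """Placeholder performance score with a few local optima over depth."""
    return math.sin(depth / 25.0) + 0.5 * math.cos(depth / 60.0) - 0.001 * depth

def anneal_depth(temp=2.0, cooling=0.999, steps=10000):
    depth = random.uniform(0.0, MAX_DEPTH)
    cur_val = spf(depth)
    best_depth, best_val = depth, cur_val
    for _ in range(steps):
        cand = min(MAX_DEPTH, max(0.0, depth + random.gauss(0, 10.0)))
        cand_val = spf(cand)
        if cand_val > cur_val or random.random() < math.exp((cand_val - cur_val) / temp):
            depth, cur_val = cand, cand_val
            if cur_val > best_val:
                best_depth, best_val = depth, cur_val
        temp *= cooling
    return best_depth, best_val

d, v = anneal_depth()
print("optimal depth ~ %.1f m (score %.3f)" % (d, v))
```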

An Optimized Model for the Local Compression Deformation of Soft Tissue

  • Zhang, Xiaorui; Yu, Xuefeng; Sun, Wei; Song, Aiguo
    • KSII Transactions on Internet and Information Systems (TIIS) / v.14 no.2 / pp.671-686 / 2020
  • Because traditional surgical training methods require long training times at high cost, the emerging virtual surgical training method has gradually replaced them as the mainstream. However, virtual surgical systems suffer from poor authenticity and high computational cost. To overcome these problems, we propose an optimized model for the local compression deformation of soft tissue. The model uses a simulated annealing algorithm to optimize the parameters of the soft tissue model and thus improve the authenticity of the simulation. Meanwhile, the soft tissue is divided into a local deformation region and a non-deformation region, and only the deformation region needs to be calculated and updated, which improves real-time performance. In addition, we define a compensation strategy for the "superelastic" effect that often occurs with the mass-spring model. To verify the validity of the model, we carry out compression simulation experiments on the abdomen and the human foot and compare the model with others. The experimental results indicate that the proposed model is realistic and effective in soft tissue compression simulation and outperforms other models in accuracy and real-time performance.
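
The parameter-optimization idea can be sketched as follows: simulated annealing fits the coefficients of a toy nonlinear spring law to reference force data. The spring law, reference data, and SA settings are assumptions for illustration, not the paper's soft-tissue model.

```python
# SA fitting of toy spring parameters to reference force-displacement data.
import math
import random

random.seed(4)
# Reference samples from an assumed "true" tissue response f(d) = k*d + c*d^3.
TRUE_K, TRUE_C = 80.0, 15.0
disps = [0.01 * i for i in range(1, 21)]
forces = [TRUE_K * d + TRUE_C * d ** 3 for d in disps]

def error(params):
    """Squared error between the model force k*d + c*d^3 and the reference."""
    k, c = params
    return sum((k * d + c * d ** 3 - f) ** 2 for d, f in zip(disps, forces))

def anneal(params, temp=1.0, cooling=0.995, steps=20000):
    cur_err = error(params)
    best, best_err = list(params), cur_err
    for _ in range(steps):
        cand = [p + random.gauss(0, 1.0) for p in params]  # perturb both parameters
        cand_err = error(cand)
        if cand_err < cur_err or random.random() < math.exp((cur_err - cand_err) / temp):
            params, cur_err = cand, cand_err
            if cur_err < best_err:
                best, best_err = list(params), cur_err
        temp *= cooling
    return best, best_err

print(anneal([50.0, 0.0]))
```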

Design and Implementation of a Microarray Data Classification Algorithm Using the Bayesian Method (베이지안 기법을 적용한 마이크로어레이 데이터 분류 알고리즘 설계와 구현)

  • Park, Su-Young; Jung, Chai-Yeoung
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.12 / pp.2283-2288 / 2006
  • Recent advances in bioinformatics technology make it possible to run experiments at the micro level, so the expression pattern of an entire genome can be observed on a chip and the interactions of thousands of genes analyzed at the same time. DNA microarray technology thus opens new directions for understanding complex organisms, and effective methods are needed to analyze the enormous amount of gene information it produces. In this thesis, we used sample data from the bioinformatics core group at Harvard University. We designed and implemented a system that, after a normalization process that reduces or removes the noise introduced by various factors in microarray experiments, divides the samples into two classes using a Bayesian algorithm, with ASA as the feature-extraction method, and evaluates the classification accuracy. The system achieved an accuracy of 98.23% after Lowess normalization.
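
To make the two-class Bayesian classification step concrete, here is a tiny Gaussian naive Bayes classifier on made-up "expression" vectors. The data, features, and normalization are invented; the paper's ASA feature extraction and the Harvard dataset are not reproduced.

```python
# Tiny Gaussian naive Bayes classifier on synthetic expression-like data.
import math
import random

random.seed(5)

def make_samples(mean, n):
    return [[random.gauss(m, 1.0) for m in mean] for _ in range(n)]

train = [(x, 0) for x in make_samples([0.0, 0.0, 1.0], 30)] + \
        [(x, 1) for x in make_samples([2.0, 1.5, -1.0], 30)]

def fit(train):
    """Per-class feature means and variances plus class priors."""
    model = {}
    for label in (0, 1):
        xs = [x for x, y in train if y == label]
        means = [sum(col) / len(xs) for col in zip(*xs)]
        vars_ = [sum((v - m) ** 2 for v in col) / len(xs) + 1e-6
                 for col, m in zip(zip(*xs), means)]
        model[label] = (means, vars_, len(xs) / len(train))
    return model

def predict(model, x):
    def log_post(label):
        means, vars_, prior = model[label]
        ll = sum(-0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
                 for xi, m, v in zip(x, means, vars_))
        return ll + math.log(prior)
    return max((0, 1), key=log_post)

model = fit(train)
test = [(x, 0) for x in make_samples([0.0, 0.0, 1.0], 20)] + \
       [(x, 1) for x in make_samples([2.0, 1.5, -1.0], 20)]
acc = sum(predict(model, x) == y for x, y in test) / len(test)
print("accuracy: %.2f" % acc)
```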

A Study on the Development of Automatic Nesting Software Using a Vectorizing Technique (벡터라이징을 이용한 자동부재배치 소프트웨어 개발에 관한 연구)

  • Lho T.J.; Kang D.J.; Kim M.S.; Park Jun-Yeong; Park S.W.
    • Proceedings of the Korean Society of Precision Engineering Conference / 2005.10a / pp.748-753 / 2005
  • Among the processes used to manufacture parts from footwear materials such as upper leathers, one of the most essential is the cutting process, in which many parts are optimally arranged on the raw material and then cut. A new nesting strategy for two-dimensional part layout was proposed using a two-stage approach, which can be effectively used for water-jet cutting. In the initial layout stage, SOAL (Self-Organization Assisted Layout), based on a combination of FCM (Fuzzy C-Means) and SOM, is adopted. In the layout improvement stage, an SA (Simulated Annealing) based approach is adopted for a finer layout. The proposed approach saves much CPU time through the two-stage scheme, whereas other annealing-based algorithms reported so far for the nesting problem are computationally expensive. Because the proposed nesting approach uses a stochastic process, it has a much higher chance of reaching a global solution than deterministic search techniques. By implementing the proposed algorithms we developed the automatic nesting software NST (ver. 1.1) for the footwear industry. NST applies the optimized automatic arrangement algorithm to cut leathers with as little loss as possible after detecting damaged areas, and it can handle several features of both natural and artificial leathers. Finally, NST reduces the time required to generate NC code, the cutting time, and the waste of raw materials, because it automatically performs part arrangement, cutting-path generation, and NC code generation, tasks that would otherwise require much effort and time to do manually.
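
The following sketch illustrates only the SA refinement stage of such a two-stage scheme: SA permutes the placement order of rectangular parts on a fixed-width strip, and a naive shelf packer scores each order by the strip length it uses. The part sizes and the packer are invented; the paper's SOAL initial layout and leather-specific handling are not modeled.

```python
# SA over placement order with a greedy shelf packer as the layout evaluator.
import math
import random

random.seed(6)
STRIP_WIDTH = 100.0
parts = [(random.uniform(10, 40), random.uniform(10, 40)) for _ in range(20)]  # (w, h)

def shelf_length(order):
    """Greedy shelf packing: fill shelves left to right, return used strip length."""
    shelf_x, shelf_y, shelf_h = 0.0, 0.0, 0.0
    for idx in order:
        w, h = parts[idx]
        if shelf_x + w > STRIP_WIDTH:          # start a new shelf
            shelf_y += shelf_h
            shelf_x, shelf_h = 0.0, 0.0
        shelf_x += w
        shelf_h = max(shelf_h, h)
    return shelf_y + shelf_h

def anneal_layout(temp=20.0, cooling=0.999, steps=30000):
    order = list(range(len(parts)))
    cur = shelf_length(order)
    best, best_len = list(order), cur
    for _ in range(steps):
        i, j = random.sample(range(len(parts)), 2)
        order[i], order[j] = order[j], order[i]
        cand = shelf_length(order)
        if cand < cur or random.random() < math.exp((cur - cand) / temp):
            cur = cand
            if cur < best_len:
                best, best_len = list(order), cur
        else:
            order[i], order[j] = order[j], order[i]    # undo rejected swap
        temp *= cooling
    return best, best_len

print(anneal_layout())
```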

An Offloading Scheduling Strategy with Minimized Power Overhead for Internet of Vehicles Based on Mobile Edge Computing

  • He, Bo; Li, Tianzhang
    • Journal of Information Processing Systems / v.17 no.3 / pp.489-504 / 2021
  • By distributing computing tasks among devices at the edge of the network, edge computing uses virtualization, distributed computing, and parallel computing technologies to let users dynamically obtain computing power, storage space, and other services as needed. Applying edge computing architectures to the Internet of Vehicles can effectively ease the conflict between the large amount of computation required by low-delay vehicle applications and the limited, unevenly distributed resources of vehicles. In this paper, a predictive offloading strategy based on the MEC load state is proposed, which considers both reducing the delay of returning calculation results over the RSU multi-hop backhaul and reducing the queuing time of tasks at MEC servers. First, a delay factor and an energy consumption factor are introduced according to the characteristics of the tasks, and the costs of local execution and of offloading to MEC servers are defined. Then, from the perspective of the vehicle, a delay preference factor and an energy consumption preference factor are introduced to define the cost the vehicle incurs for executing a computing task. Furthermore, a mathematical optimization model that minimizes the power overhead is constructed under delay and power-consumption constraints, and the simulated annealing algorithm is used to solve it; a task is then executed locally or offloaded to an MEC server according to which of the two costs is smaller. Simulation results show that this strategy effectively reduces system power consumption by shortening the task execution delay while meeting the delay and energy-consumption requirements at the lowest cost.
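
The cost comparison described above can be sketched as follows: a task runs locally or is offloaded depending on which weighted delay/energy cost is smaller. All the constants (CPU rates, transmit power, preference weights, queue wait) are invented placeholders, not the paper's model.

```python
# Hedged sketch of a local-vs-offload cost comparison with preference factors.
from dataclasses import dataclass

@dataclass
class Task:
    cycles: float        # CPU cycles required
    data_bits: float     # input size to upload when offloading

# Assumed device/MEC parameters (illustrative only).
F_LOCAL = 1e9        # local CPU speed, cycles/s
F_MEC = 8e9          # MEC CPU speed, cycles/s
RATE_UP = 5e6        # uplink rate, bits/s
P_CPU = 0.9          # local compute power, W
P_TX = 0.3           # transmit power, W
ALPHA, BETA = 0.6, 0.4   # delay and energy preference factors

def local_cost(t: Task) -> float:
    delay = t.cycles / F_LOCAL
    energy = P_CPU * delay
    return ALPHA * delay + BETA * energy

def offload_cost(t: Task, queue_wait: float = 0.05) -> float:
    delay = t.data_bits / RATE_UP + queue_wait + t.cycles / F_MEC
    energy = P_TX * (t.data_bits / RATE_UP)   # device only pays for transmission
    return ALPHA * delay + BETA * energy

task = Task(cycles=2e9, data_bits=4e6)
decision = "offload" if offload_cost(task) < local_cost(task) else "local"
print(decision, local_cost(task), offload_cost(task))
```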

A Video Abstraction Algorithm Reflecting Various Users' Requirements (사용자의 요구를 반영하는 동영상 요약 알고리즘)

  • 정진국; 홍승욱; 낭종호; 하명환; 정병희; 김경수
    • Journal of KIISE: Software and Applications / v.30 no.7_8 / pp.599-609 / 2003
  • Video abstraction is the process of picking out the important shots of a video, although which shots are important may vary with each person's subjectivity. Previous work on video abstraction uses only a single low-level feature to choose important shots. This thesis proposes an abstraction scheme that selects a set of shots which simultaneously satisfies the desired features (or objective functions) of a good abstraction. Since the complexity of finding the set of shots that maximizes the sum of the objective function values is $O(2^n)$, the proposed scheme uses a simulated-annealing-based search to find a suboptimal solution within a short time. Experimental results on various videos indicate that the proposed scheme produces reasonable video abstractions, and it can be used to build a digital video library.
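
A minimal sketch of the shot-selection idea: simulated annealing searches over subsets of shots for one that maximizes a sum of (invented) objective scores under an abstract-length limit, instead of enumerating all 2^n subsets. The shot lengths, scores, and length budget are assumptions.

```python
# SA over shot subsets with a length budget and invented per-shot scores.
import math
import random

random.seed(7)
N_SHOTS = 30
MAX_LEN = 60.0                       # seconds allowed in the abstract (assumed)
length = [random.uniform(2, 10) for _ in range(N_SHOTS)]
score = [random.uniform(0, 1) for _ in range(N_SHOTS)]   # stand-in objective values

def value(selected):
    """Total score, or -inf if the selection exceeds the length budget."""
    if sum(length[i] for i in selected) > MAX_LEN:
        return float("-inf")
    return sum(score[i] for i in selected)

def anneal_subset(temp=1.0, cooling=0.999, steps=20000):
    selected = set()
    cur = value(selected)
    best, best_val = set(selected), cur
    for _ in range(steps):
        cand = set(selected)
        i = random.randrange(N_SHOTS)
        cand.symmetric_difference_update({i})        # flip one shot in/out
        cand_val = value(cand)
        if cand_val > cur or random.random() < math.exp(min(0.0, (cand_val - cur) / temp)):
            selected, cur = cand, cand_val
            if cur > best_val:
                best, best_val = set(selected), cur
        temp *= cooling
    return sorted(best), best_val

print(anneal_subset())
```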

Determining Optimal WIP Level and Buffer Size Using Simulated Annealing in Semiconductor Production Line (반도체 생산라인에서 SA를 이용한 최적 WIP수준과 버퍼사이즈 결정)

  • Jeong, Jaehwan; Jang, Sein; Lee, Jonghwan
    • Journal of the Semiconductor & Display Technology / v.20 no.3 / pp.57-64 / 2021
  • The domestic semiconductor industry can produce various products that satisfy customer needs by diversifying assembly parts and increasing the compatibility between them. Improving the production line is necessary to reduce the work-in-process inventory (WIP) in the assembly line, the idle time of workers, and the idle time of processes. Improving the production line means balancing the capabilities of the processes as a whole and determining the timing of product input or the order of the work processes so that the time required between processes is balanced. The purpose of this study is to find, through SA (Simulated Annealing), the optimal WIP level and buffer size that minimize lead time while matching the quantities of two parts in a parallel assembly line with a bottleneck process. The WIP level and buffer size obtained by the SA algorithm were applied to the existing CONWIP and DBR production systems and to a new hybrid production system, and simulations were performed. Here, the hybrid method combines the CONWIP and DBR methods under newly defined rules. The simulation results were evaluated on three criteria: lead time, production volume, and work-in-process inventory, and the effect of the hybrid production method was verified through a comparative analysis of these results.
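
As a sketch of the search itself, the code below runs simulated annealing over integer (WIP cap, buffer size) pairs, scored by a toy Little's-law model of a line with one bottleneck. The rates, the blocking approximation, and the throughput target are invented so the example runs standalone; they are not the paper's simulation model.

```python
# SA over integer (WIP cap, buffer size) pairs with a toy lead-time model.
import math
import random

random.seed(8)
RAW_TIME = 1.5        # hours of raw processing per unit (assumed)
BOTTLENECK = 10.0     # bottleneck rate, units/hour (assumed)
TARGET_TH = 9.0       # required throughput, units/hour (assumed)

def lead_time(wip, buffer):
    eff_rate = BOTTLENECK * buffer / (buffer + 2.0)   # crude blocking penalty
    th = min(wip / RAW_TIME, eff_rate)                # throughput bound
    lt = wip / th                                     # Little's law: LT = WIP / TH
    penalty = 100.0 * max(0.0, TARGET_TH - th)        # punish missing throughput
    return lt + penalty

def anneal(temp=5.0, cooling=0.995, steps=10000):
    state = (20, 5)                                   # (WIP cap, buffer size)
    cur = lead_time(*state)
    best, best_cost = state, cur
    for _ in range(steps):
        cand = (max(1, state[0] + random.choice((-1, 1))),
                max(1, state[1] + random.choice((-1, 1))))
        c = lead_time(*cand)
        if c < cur or random.random() < math.exp((cur - c) / temp):
            state, cur = cand, c
            if cur < best_cost:
                best, best_cost = state, cur
        temp *= cooling
    return best, best_cost

print(anneal())
```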