• Title/Summary/Keyword: pareto-optimal


Numerical optimization of Wells turbine for wave energy extraction

  • Halder, Paresh;Rhee, Shin Hyung;Samad, Abdus
    • International Journal of Naval Architecture and Ocean Engineering
    • /
    • v.9 no.1
    • /
    • pp.11-24
    • /
    • 2017
  • The present work focuses on multi-objective optimization of blade sweep for a Wells turbine. The blade-sweep parameters at the mid and tip sections are selected as design variables. The peak torque coefficient and the corresponding efficiency are the objective functions to be maximized. The numerical analysis is carried out by solving the 3D RANS equations with the k-ω SST turbulence model. Nine design points are selected within the design space and simulated. Based on the computational results, surrogate-based weighted-average models are constructed, and a population-based multi-objective evolutionary algorithm produces the Pareto-optimal solutions. The peak torque coefficient and the corresponding efficiency are enhanced, and the results are analysed using CFD simulations. The two extreme designs among the Pareto solutions show that the peak torque coefficient is increased by 28.28% while the corresponding efficiency is decreased by 13.5%. A detailed flow analysis shows that changes in the separation phenomena alter the turbine performance.
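Pareto-optimal solutions like those reported above are simply the designs that no other design beats on both objectives at once. A minimal sketch of such a non-dominated filter, with hypothetical (torque coefficient, efficiency) values rather than the authors' data:

```python
def dominates(a, b):
    """True if design a dominates b when all objectives are maximized."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(designs):
    """Return the non-dominated subset of objective tuples."""
    return [d for d in designs
            if not any(dominates(o, d) for o in designs if o != d)]

# Hypothetical (torque coefficient, efficiency) pairs for sampled design points
points = [(0.30, 0.55), (0.35, 0.50), (0.28, 0.60), (0.33, 0.52), (0.25, 0.58)]
front = pareto_front(points)  # (0.25, 0.58) is dominated by (0.28, 0.60)
```

The two "extreme designs" mentioned in the abstract are then simply the endpoints of this front, trading peak torque against efficiency.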

Optimization Design for Dynamic Characters of Electromagnetic Apparatus Based on Niche Sorting Multi-objective Particle Swarm Algorithm

  • Xu, Le;You, Jiaxin;Yu, Haidan;Liang, Huimin
    • Journal of Magnetics
    • /
    • v.21 no.4
    • /
    • pp.660-665
    • /
    • 2016
  • The electromagnetic apparatus plays an important role in high-power electrical systems, so an effective approach to optimizing such apparatus is of great importance. However, optimization of electromagnetic apparatus often suffers from premature convergence and a sparse Pareto solution set. This paper proposes a modified multi-objective particle swarm optimization algorithm based on a niche sorting strategy, which yields a better Pareto-optimal front with an enhanced distribution. To address the shortcomings of closing bounce and slow breaking velocity in electromagnetic apparatus, a multi-objective optimization model was established on the basis of the traditional optimization. The improved multi-objective particle swarm optimization algorithm was then applied to this model to obtain a set of optimized parameters (decision variables). Compared with other classical algorithms, the modified algorithm shows satisfactory performance on multi-objective optimization problems for the electromagnetic apparatus.

The optimization for the straight-channel PCHE size for supercritical CO2 Brayton cycle

  • Xu, Hong;Duan, Chengjie;Ding, Hao;Li, Wenhuai;Zhang, Yaoli;Hong, Gang;Gong, Houjun
    • Nuclear Engineering and Technology
    • /
    • v.53 no.6
    • /
    • pp.1786-1795
    • /
    • 2021
  • The Printed Circuit Heat Exchanger (PCHE) is widely used in the supercritical carbon dioxide (sCO2) Brayton cycle because it can operate under high temperature and pressure, and it has been a hot topic in Next Generation Nuclear Plant (NGNP) projects for use as recuperators and condensers. Most previous studies focused on channel structures or shapes; however, no clear advancement has so far been made regarding the overall size of the PCHE. In this paper, we propose an optimal size for a PCHE of fixed volume. Two boundary conditions of the PCHE were simulated. With the volume fixed, the heat transfer rate and pressure loss were chosen as the optimization objectives, and the Pareto front was obtained by a multi-objective optimization procedure. From the Pareto front, we obtained the optimized number of PCHE channels under the two boundary conditions. The comprehensive performance can be increased by 5.3% while keeping the same volume. The numerical results of this study can be used to improve the design of PCHEs with straight channels.

Multi-objective optimization of submerged floating tunnel route considering structural safety and total travel time

  • Eun Hak Lee;Gyu-Jin Kim
    • Structural Engineering and Mechanics
    • /
    • v.88 no.4
    • /
    • pp.323-334
    • /
    • 2023
  • The submerged floating tunnel (SFT) has been regarded as an emerging infrastructure technology that efficiently and safely connects land and islands. The SFT route problem is an essential part of the SFT planning and design phase, with significant impacts on the surrounding environment. This study aims to develop an optimization model that considers both transportation and structural factors. The SFT routing problem was optimized with respect to two objective functions, minimizing total travel time and cumulative strain, using NSGA-II. The proposed model was applied to the section from Mokpo to Jeju Island using road network and wave observation data. The model produced a Pareto optimum curve showing a negative correlation between total travel time and cumulative strain. Based on the inflection points of this curve, four optimal SFT routes were selected and compared to identify their pros and cons. The travel time savings of the four selected alternatives were estimated to range from 9.9% to 10.5% compared to the non-implemented scenario. In terms of demand, there was a substantial shift in travel and freight trips from airways to railways and roadways. Cumulative strain, calculated from SFT distance, support structure, and wave energy, was found to be low when the route passed through small islands. The proposed model supports decision-making in the planning and design phases of SFT projects, ultimately contributing to safe, efficient, and sustainable SFT infrastructure.
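NSGA-II, used above, repeatedly partitions the population into successive non-dominated fronts before selection. A minimal sketch of that sorting step, assuming both objectives (total travel time, cumulative strain) are minimized and using hypothetical route values rather than the study's data:

```python
def dominates(a, b):
    """True if route a dominates b when both objectives are minimized."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(pop):
    """Rank solutions into successive Pareto fronts (the core NSGA-II step)."""
    fronts, remaining = [], list(pop)
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Hypothetical (total travel time, cumulative strain) values for candidate routes
routes = [(100, 9), (120, 5), (110, 7), (130, 6), (140, 10)]
fronts = non_dominated_sort(routes)  # fronts[0] is the Pareto optimum curve
```

The first front is the Pareto curve from which the abstract's four inflection-point routes would be picked; later fronts only matter for maintaining population diversity.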

Pareto Ratio and Inequality Level of Knowledge Sharing in Virtual Knowledge Collaboration: Analysis of Behaviors on Wikipedia (지식 공유의 파레토 비율 및 불평등 정도와 가상 지식 협업: 위키피디아 행위 데이터 분석)

  • Park, Hyun-Jung;Shin, Kyung-Shik
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.3
    • /
    • pp.19-43
    • /
    • 2014
  • The Pareto principle, also known as the 80-20 rule, states that roughly 80% of the effects come from 20% of the causes for many events, including natural phenomena. It has been recognized as a golden rule in business, with wide application of findings such as 20 percent of customers accounting for 80 percent of total sales. On the other hand, the Long Tail theory, which points out that "the trivial many" produce more value than "the vital few," has gained popularity in recent times thanks to the tremendous reduction of distribution and inventory costs enabled by the development of ICT (Information and Communication Technology). This study set out to illuminate how these two primary business paradigms, the Pareto principle and the Long Tail theory, relate to the success of virtual knowledge collaboration. The importance of virtual knowledge collaboration is soaring in this era of globalization and virtualization, which transcends geographical and temporal constraints. Many previous studies on knowledge sharing have focused on the factors that affect it, seeking to boost individual knowledge sharing and to resolve the social dilemma arising from the fact that rational individuals are more likely to consume knowledge than to contribute it. Knowledge collaboration can be defined as the creation of knowledge not only by sharing knowledge but also by transforming and integrating it. From this perspective, the relative distribution of knowledge sharing among participants can matter as much as the absolute amount of individual knowledge sharing. In particular, whether a greater contribution by the upper 20 percent of participants enhances the efficiency of overall knowledge collaboration is an issue of interest. This study examines the effect of this distribution of knowledge sharing on the efficiency of knowledge collaboration, and extends the analysis to reflect work characteristics.
All analyses were conducted on actual behavioral data rather than self-reported questionnaire surveys. Specifically, we analyzed the collaborative behaviors of the editors of 2,978 English Wikipedia featured articles, the highest quality grade of articles in English Wikipedia. We adopted the Pareto ratio, the ratio of the number of knowledge contributions made by the upper 20 percent of participants to the total number of contributions made by all participants of an article group, to examine the effect of the Pareto principle. In addition, the Gini coefficient, which represents the inequality of income among a group of people, was applied to capture the effect of inequality in knowledge contribution. Hypotheses were set up on the assumption that a higher ratio of knowledge contribution by more highly motivated participants leads to higher collaboration efficiency, but that if the ratio gets too high, collaboration efficiency deteriorates because overall informational diversity is threatened and the knowledge contribution of less motivated participants is discouraged. Cox regression models were formulated for each of the focal variables, the Pareto ratio and the Gini coefficient, with seven control variables such as the number of editors involved in an article, the average time between successive edits of an article, and the number of sections a featured article has. The dependent variable of the Cox models is the time from article initiation to promotion to featured-article status, indicating the efficiency of knowledge collaboration. To examine whether the effects of the focal variables vary depending on the characteristics of a group task, we classified the 2,978 featured articles into two categories, academic and non-academic, where an academic article cites at least one paper published in an SCI, SSCI, A&HCI, or SCIE journal.
We assumed that academic articles are more complex, entail more information processing and problem solving, and thus require more skill variety and expertise. The analysis results indicate the following. First, the Pareto ratio and the inequality of knowledge sharing relate in a curvilinear fashion to collaboration efficiency in an online community, promoting it up to an optimal point and undermining it thereafter. Second, this curvilinear effect is more pronounced for more academic tasks.
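Both focal measures are straightforward to compute from raw per-editor edit counts. A minimal sketch, using hypothetical edit counts rather than the Wikipedia data:

```python
def pareto_ratio(contributions):
    """Share of total contributions made by the top 20% of contributors."""
    s = sorted(contributions, reverse=True)
    k = max(1, round(0.2 * len(s)))
    return sum(s[:k]) / sum(s)

def gini(contributions):
    """Gini coefficient of contribution counts (0 = equal, toward 1 = unequal)."""
    s = sorted(contributions)
    n = len(s)
    weighted = sum((i + 1) * x for i, x in enumerate(s))
    return (2 * weighted) / (n * sum(s)) - (n + 1) / n

# Hypothetical per-editor edit counts for one article group
edits = [1, 1, 2, 3, 5, 8, 40, 40, 50, 100]
pr = pareto_ratio(edits)  # top 2 of 10 editors made 150 of 250 edits -> 0.6
g = gini(edits)
```

In the study these two values would be the focal covariates of the Cox models, with promotion time as the survival outcome.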

Optimization Methodology for Sales and Operations Planning by Stochastic Programming under Uncertainty : A Case Study in Service Industry (불확실성하에서의 확률적 기법에 의한 판매 및 실행 계획 최적화 방법론 : 서비스 산업)

  • Hwang, Seon Min;Song, Sang Hwa
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.39 no.4
    • /
    • pp.137-146
    • /
    • 2016
  • In recent years, the business environment has faced multiple uncertainties not experienced in the past. As supply chains become larger and longer, the flows of information, material, and production also become more complicated. It is well known that the software development service industry faces various uncertainties in random events such as fluctuations in the supply of and demand for developer capacity, the project effective date after winning a contract, manpower cost (or revenue), subcontract cost (or purchases), and overruns due to developers' skill levels. This study aims to optimize the enterprise's goals through a supply chain management platform that balances demand and supply, applying stochastic programming to handle uncertainty while considering the solution supplier's economic and operational risk. In particular, this study determines the allocation of internal and external developer manpower using S&OP (Sales & Operations Planning), since monthly resource input is constrained by the capability of resources shared across the industry or task. Through experiments with processes and data from a service company that develops software and performs projects, this study verifies how stochastic programming approaches such as Markowitz's MV (Mean-Variance) model or the 2-stage recourse model are more flexible and efficient than deterministic programming in the software enterprise field. In addition, it analyzes how the profit and labor input plans change with the scope of uncertainty, based on Pareto optimality. Lastly, it enumerates the limitations of the study, including drawbacks that can arise in real business environments, and suggests directions for future research using other applicable methodologies.
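A 2-stage recourse model of the kind mentioned above fixes a first-stage decision (here, internal staffing) before demand is known, then pays a scenario-dependent recourse cost (here, subcontracting) once demand is revealed. A minimal sketch with hypothetical costs and demand scenarios, enumerating candidate staffing levels instead of calling an LP solver:

```python
# Hypothetical (probability, demand in developer-months) scenarios
scenarios = [(0.3, 80), (0.5, 100), (0.2, 130)]
internal_cost, subcontract_cost = 1.0, 1.6  # subcontracting assumed pricier

def expected_cost(x):
    """First-stage staffing cost plus expected recourse (subcontract) cost."""
    recourse = sum(p * subcontract_cost * max(0, d - x) for p, d in scenarios)
    return internal_cost * x + recourse

# Second stage is trivial here (cover any shortfall), so we can just
# enumerate first-stage candidates rather than solve a two-stage LP.
best = min(range(60, 141), key=expected_cost)
```

The deterministic counterpart would plan for the expected demand only; the recourse model instead hedges across scenarios, which is the flexibility the study attributes to stochastic programming.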

Optimization for the Design Parameters of Electric Locomotive Overhaul Maintenance Facility (전기 기관차 중수선 시설의 설계 변수 최적화)

  • Um, In-Sup;Cheon, Hyeon-Jae;Lee, Hong-Chul
    • Journal of the Korean Society for Railway
    • /
    • v.13 no.2
    • /
    • pp.222-228
    • /
    • 2010
  • In this paper, we propose an optimization approach for the Electric Locomotive Overhaul Maintenance Facility (ELOMF), which aims at simulation optimization so as to meet the design specification. In the simulation design, we consider the critical path and a sensitivity analysis of the critical (dependent) factors and the design (independent) parameters for parameter selection and reduction of the metamodel. We then construct a multi-objective non-linear program: the objective function is normalized for the generalization of the design parameters, while the constraints consist of simulation-based regression metamodels for the critical factors and the design factors' domains. An effective solution procedure based on the Pareto-optimal solution set is then proposed. This approach provides a comprehensive method for optimizing the design parameters of a Train Overhaul Maintenance Facility (TOMF) using simulation and metamodels.
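Normalizing objectives onto a common scale, as in the multi-objective program above, is typically a min-max rescaling over each factor's design domain. A minimal sketch with hypothetical objectives and ranges, not the paper's actual model:

```python
def normalize(value, lo, hi):
    """Min-max scale an objective onto [0, 1] over its design domain."""
    return (value - lo) / (hi - lo)

# Hypothetical objectives: throughput (maximize) and cycle time (minimize)
throughput = normalize(42.0, 20.0, 60.0)      # -> 0.55
cycle_time = 1.0 - normalize(8.0, 5.0, 15.0)  # invert so higher is better

# Normalized objectives can then be weighted on equal footing
score = 0.5 * throughput + 0.5 * cycle_time
```

Without this step, objectives measured in different units (trains/month versus hours) would let the larger-magnitude one dominate any weighted comparison.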

A Simulation-based Optimization Approach for the Selection of Design Factors (설계 변수 선택을 위한 시뮬레이션 기반 최적화)

  • Um, In-Sup;Cheon, Hyeon-Jae;Lee, Hong-Chul
    • Journal of the Korea Society for Simulation
    • /
    • v.16 no.2
    • /
    • pp.45-54
    • /
    • 2007
  • In this article, we propose a different modeling approach, which aims at simulation optimization so as to meet the design specification. Generally, a multi-objective optimization problem is formulated with dependent factors as objective functions and independent factors as constraints. This paper, however, presents the critical (dependent) factors as the objective function and the design (independent) factors as constraints, so that design factors can be selected directly. The objective function is normalized for the generalization of the design factors, while the constraints consist of simulation-based regression metamodels for the critical factors and the design factors' domains. An effective and fast solution procedure based on the Pareto-optimal solution set is then proposed. This paper provides a comprehensive framework for system design using simulation and metamodels; the method developed in this research can therefore be adopted for other enhancements in different but comparable situations.


Experimental validation of FE model updating based on multi-objective optimization using the surrogate model

  • Hwang, Yongmoon;Jin, Seung-seop;Jung, Ho-Yeon;Kim, Sehoon;Lee, Jong-Jae;Jung, Hyung-Jo
    • Structural Engineering and Mechanics
    • /
    • v.65 no.2
    • /
    • pp.173-181
    • /
    • 2018
  • In this paper, finite element (FE) model updating based on multi-objective optimization with a surrogate model is investigated for a steel plate girder bridge. Conventionally, FE model updating for bridge structures uses single-objective optimization with finite element analysis (FEA). In the conventional method, a considerable computational burden arises because many iterations are performed during the updating process. This issue can be addressed by replacing FEA with a surrogate model. Another problem is that the updating result from single-objective optimization depends on the choice of weighting factors. Previous studies have used a trial-and-error strategy, a genetic algorithm, or the user's preference to obtain the most preferred model, but this requires considerable computation. In this study, an FE model updating method combining a surrogate model and multi-objective optimization, which can construct the Pareto-optimal front through a single run without considering weighting factors, is proposed to overcome the limitations of single-objective optimization. To verify the proposed method, its results are compared with those of single-objective optimization. The comparison shows that the model updated by multi-objective optimization is superior in calculation time as well as in the relative errors between the updated model and the measurements.
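A surrogate model replaces each expensive FEA run with a cheap approximation fitted to a few sampled analyses. A minimal piecewise-linear sketch over hypothetical (parameter, response) samples, not the authors' surrogate:

```python
def build_surrogate(samples):
    """Piecewise-linear surrogate over (parameter, response) samples,
    replacing repeated finite element runs with cheap interpolation."""
    pts = sorted(samples)

    def predict(x):
        for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                return y0 + t * (y1 - y0)
        raise ValueError("x outside sampled range")

    return predict

# Hypothetical FEA samples: (stiffness scale factor, first natural frequency in Hz)
samples = [(0.8, 2.10), (1.0, 2.35), (1.2, 2.55)]
freq = build_surrogate(samples)  # the optimizer now queries this, not the FEA
```

Once the optimizer only queries the surrogate, the thousands of evaluations a multi-objective run needs cost almost nothing; the FEA budget is spent only on the initial samples.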

Optimization of Stacking Strategies Considering Yard Occupancy Rate in an Automated Container Terminal (장치장 점유율을 고려한 자동화 컨테이너 터미널의 장치 위치 결정 전략 최적화)

  • Sohn, Min-Je;Park, Tae-Jin;Ryu, Kwang-Ryel
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.16 no.11
    • /
    • pp.1106-1110
    • /
    • 2010
  • This paper proposes a method of optimizing a stacking strategy for an automated container terminal using multi-objective evolutionary algorithms (MOEAs). Since the yard productivities of the seaside and the landside are conflicting objectives, it is impossible to maximize both simultaneously; we therefore derive a Pareto-optimal set instead of a single best solution using an MOEA. Preliminary experiments showed that the population frequently gets stuck in local optima because the difficulty of the given problem depends on the yard occupancy rate. To cope with this, we propose a method of simultaneously optimizing two problems of different difficulty so that diverse solutions are preserved in the population. Experimental results showed that the proposed method derives better stacking policies than a compared method that solves a single problem, given the same computational cost.