• Title/Summary/Keyword: support optimization

Search Results: 765

Parameter search methodology of support vector machines for improving performance (속도 향상을 위한 서포트 벡터 머신의 파라미터 탐색 방법론)

  • Lee, Sung-Bo;Kim, Jae-young;Kim, Cheol-Hong;Kim, Jong-Myon
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology / v.7 no.3 / pp.329-337 / 2017
  • This paper proposes a search method that explores the parameters C and σ of support vector machines (SVM) to improve speed while maintaining search accuracy. A traditional grid search requires tremendous computation time because it evaluates every available combination of C and σ to find the combination that yields the best SVM performance. To address this issue, this paper proposes a deep search method that reduces computation time. In the first stage, it divides the C-σ accuracy map into four regions, evaluates the median point of each region, and selects the point with the highest accuracy as the starting point. In the second stage, the region around the selected starting point is re-divided into four regions, and the point with the highest accuracy is assigned as the new search point. In the third stage, the eight points neighboring the search point are explored, the point with the highest accuracy becomes the new search point, and the surrounding region is again divided into four parts whose accuracies are calculated. This process continues until the accuracy at the current point is higher than at all neighboring points; if this is not satisfied, the procedure is repeated from the second stage with the input level value. Experimental results using normal and defective bearings show that the proposed deep search algorithm outperforms conventional algorithms in terms of both performance and search time.
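A minimal sketch of the coarse-to-fine idea the abstract describes: repeatedly quarter the (log C, log σ) region and descend into the most accurate quadrant instead of exhaustively scanning a grid. The accuracy function here is a synthetic stand-in for cross-validated SVM accuracy; all ranges and values are illustrative, not the paper's.

```python
import math

def accuracy(C, sigma):
    # Stand-in for cross-validated SVM accuracy; peaks near C=10, sigma=0.1.
    return math.exp(-((math.log10(C) - 1) ** 2 + (math.log10(sigma) + 1) ** 2))

def deep_search(c_range=(-3, 3), s_range=(-3, 3), levels=6):
    """Repeatedly quarter the (log C, log sigma) region, keeping the
    quadrant whose midpoint scores highest, instead of an exhaustive grid."""
    (c_lo, c_hi), (s_lo, s_hi) = c_range, s_range
    best = None
    for _ in range(levels):
        c_mid, s_mid = (c_lo + c_hi) / 2, (s_lo + s_hi) / 2
        # Midpoints of the four sub-regions.
        candidates = [((c_lo + c_mid) / 2, (s_lo + s_mid) / 2),
                      ((c_mid + c_hi) / 2, (s_lo + s_mid) / 2),
                      ((c_lo + c_mid) / 2, (s_mid + s_hi) / 2),
                      ((c_mid + c_hi) / 2, (s_mid + s_hi) / 2)]
        scored = [(accuracy(10 ** c, 10 ** s), c, s) for c, s in candidates]
        best = max(scored)                      # highest-accuracy quadrant
        _, c_best, s_best = best
        # Shrink the search window around the winning quadrant's midpoint.
        half_c, half_s = (c_hi - c_lo) / 4, (s_hi - s_lo) / 4
        c_lo, c_hi = c_best - half_c, c_best + half_c
        s_lo, s_hi = s_best - half_s, s_best + half_s
    return 10 ** best[1], 10 ** best[2]
```

Each level evaluates only four points while halving the search window, which is the source of the claimed speed-up over a full grid.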

A Study on Rapid Color Difference Discrimination for Fabrics using Digital Imaging Device (디지털 화상 장치를 이용한 섬유제품류 간이 색차판별에 관한 연구)

  • Park, Jae Woo;Byun, Kisik;Cho, Sung-Yong;Kim, Byung-Soon;Oh, Jun-Ho
    • Journal of the Korea Academia-Industrial cooperation Society / v.20 no.8 / pp.29-37 / 2019
  • Textile quality management targets the physical properties of fabrics and the subjective discrimination of color and fit. Color is the most representative quality factor that consumers can use to evaluate quality levels without any instruments. For this reason, quantification using a color-discrimination device has been used for statistical quality management in the textile industry. However, small and medium-sized domestic textile manufacturers rely only on visual inspection for color discrimination. As a result, color discrimination varies with the inspectors' individual tendencies and work procedures. In this research, we aim to develop a textile-industry-friendly quality management method by evaluating the possibility of rapid color discrimination using a digital imaging device, one of the common office-automation instruments. The results show that an imaging-process-based color discrimination method is highly correlated with conventional color-discrimination instruments (R² = 0.969), and is also applicable to field discrimination within the manufacturing process or across different lots. Moreover, it is possible to identify quality management factors by analyzing the color components ΔL, Δa, and Δb. We hope that our rapid discrimination method, after further elaboration and optimization, can substitute for conventional color-discrimination instruments.
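The ΔL, Δa, Δb components mentioned above combine into a single color difference via the standard CIE76 ΔE formula (Euclidean distance in L*a*b* space). A minimal sketch, independent of any particular imaging device:

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space.
    Returns DeltaE along with the per-channel components dL, da, db."""
    dL, da, db = (c2 - c1 for c1, c2 in zip(lab1, lab2))
    return math.sqrt(dL ** 2 + da ** 2 + db ** 2), (dL, da, db)

# Example: reference swatch vs. a production sample.
de, (dL, da, db) = delta_e76((50, 0, 0), (53, 4, 0))
```

Inspecting the individual components (as the paper does) shows whether a mismatch comes from lightness (ΔL) or from the color axes (Δa, Δb).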

HEVC Encoder Optimization using Depth Information (깊이정보를 이용한 HEVC의 인코더 고속화 방법)

  • Lee, Yoon Jin;Bae, Dong In;Park, Gwang Hoon
    • Journal of Broadcast Engineering / v.19 no.5 / pp.640-655 / 2014
  • Many of today's video systems include an additional depth camera to provide extra features such as 3D support. Thanks to these changes in multimedia systems, it is now much easier to obtain the depth information of a video. Depth information can be used in various areas such as object classification and background recognition. With depth information, we can achieve even higher coding efficiency than with conventional methods alone. Thus, in this paper, we propose a 2D video coding algorithm that uses depth information on top of the next-generation 2D video codec HEVC. The background area can be recognized from depth information, and by running HEVC with it, coding complexity can be reduced. If the current CU is in a background area, we apply the following three methods: 1) early termination of CU splitting with the PU SKIP mode, 2) limiting the CU split structure using CU information at the co-located temporal position, and 3) limiting the motion search range. We implemented our proposal in the HEVC HM 12.0 reference software. Results show that these methods reduce encoding complexity by more than 40% with only a 0.5% BD-Bitrate loss. In particular, for video acquired through the Kinect developed by Microsoft Corp., encoding complexity is reduced by up to 53% without quality loss. These techniques are therefore expected to apply to real-time online communication, mobile or handheld video services, and so on.
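The decisions above hinge on classifying a CU as background from its depth samples. A hypothetical sketch of such a test, assuming a far-away and nearly flat depth block indicates background (the threshold values and the depth convention are assumptions, not the paper's):

```python
def is_background_cu(depth_block, far_threshold=200, variance_limit=25.0):
    """Hypothetical background test for a CU: a large mean depth value
    (assumed to mean far from the camera) combined with nearly flat depth
    suggests background, so the encoder could stop splitting early and
    try SKIP/merge modes first."""
    n = len(depth_block)
    mean = sum(depth_block) / n
    variance = sum((d - mean) ** 2 for d in depth_block) / n
    return mean >= far_threshold and variance <= variance_limit
```

A flat, distant block returns True (candidate for early split termination and a reduced motion search range); a block mixing near and far samples returns False and is coded normally.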

An Adaptive Materialized Query Selection Method in a Mediator System (미디에이터 시스템의 적응적 구체화 질의 선택방법)

  • Joo, Kil-Hong;Lee, Won-Suk
    • The KIPS Transactions:PartD / v.11D no.1 / pp.83-94 / 2004
  • Recent research aimed at integrating distributed information has concentrated on developing efficient mediator systems that not only provide a high degree of autonomy for local users but also support the flexible integration of the functions required by global users. However, little attention has been paid to how a global query is evaluated in a mediator. A global query is transformed into a set of sub-queries, each of which is the unit of evaluation in a remote server. Therefore, it is possible to speed up the execution of a global query if the previous results of frequently evaluated sub-queries are materialized in the mediator. Since the integration schema of a mediator can be incrementally modified and the evaluation frequency of a global query can vary continuously, query usage should be carefully monitored to determine the optimal set of materialized sub-queries. Furthermore, as the number of sub-queries increases, the optimization process itself may take so long that the optimized set it identifies becomes obsolete due to recent changes in query usage. This paper proposes an adaptive selection of materialized sub-queries such that the available storage in a mediator is highly utilized at any time. In order to differentiate the recent usage of a query from its past usage, the accumulated usage frequency of a query decays as time goes by.
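The decaying usage frequency and storage-constrained selection described above can be sketched as follows. This is a minimal illustration, assuming a multiplicative decay per time step and a greedy benefit-per-byte selection; the paper's actual policy may differ.

```python
class DecayingCounter:
    """Usage frequency that decays over time, so recent evaluations of a
    sub-query outweigh old ones (the per-step decay factor is assumed)."""
    def __init__(self, decay=0.9):
        self.decay = decay
        self.weight = 0.0
    def tick(self):
        # Advance one time step: old usage fades.
        self.weight *= self.decay
    def hit(self):
        # Record one evaluation of the sub-query.
        self.weight += 1.0

def select_materialized(counters, sizes, capacity):
    """Greedily materialize the sub-queries with the highest decayed
    frequency per unit of storage until the mediator's space runs out."""
    order = sorted(counters, key=lambda q: counters[q].weight / sizes[q],
                   reverse=True)
    chosen, used = [], 0
    for q in order:
        if used + sizes[q] <= capacity:
            chosen.append(q)
            used += sizes[q]
    return chosen
```

Re-running the selection periodically keeps the materialized set adapted to recent query usage without a long global optimization pass.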

Design of Data Fusion and Data Processing Model According to Industrial Types (산업유형별 데이터융합과 데이터처리 모델의 설계)

  • Jeong, Min-Seung;Jin, Seon-A;Cho, Woo-Hyun
    • KIPS Transactions on Software and Data Engineering / v.6 no.2 / pp.67-76 / 2017
  • Industrial sites in various fields generate large amounts of correlated data. Although a variety of data can be collected in each type of industrial process, the associations between processes cannot be integrated with one another. In existing industrial practice, the set values in the molding-condition table are entered by the operator as arbitrary values when a problem occurs in the work process. In this paper, we design a fusion and analysis-processing model for the data collected for each industrial type. Through a prediction case (automobile connectors) in the process-manufacturing industry, corporate earnings can be improved by comparing master data, such as the standard molding-condition table, with the production history files collected during the manufacturing process: the failure rate is reduced with a new, digitized molding-condition table replacing the operator's arbitrary values; various malfunction factors and exceptions are reinterpreted through new pattern analysis; and productivity, process improvement, and cost savings increase. The model can be designed for a variety of data analyses and validations. In addition, the standard set values, once analyzed and verified, secure the objectivity, consistency, and optimization of the manufacturing process, and optimization (standard-setting) techniques suited to each industry type can be supported through various pattern types.

A study on the feasibility assessment model of urban utility tunnel by analytic hierarchy process (계층의사분석 기법을 적용한 도심지 공동구 타당성 평가모델 연구)

  • Chung, Jee-Seung;Na, Gwi-Tae
    • Journal of Korean Tunnelling and Underground Space Association / v.20 no.1 / pp.131-144 / 2018
  • The urban center of a large city has a high concentration of population, commerce, and traffic. The expected benefit of introducing an urban utility tunnel is therefore high, and it also has sufficient economic feasibility considering life-cycle cost. Moreover, the construction cost can be greatly reduced if it is included in a large underground development such as a subway or a complex transfer-center construction. However, this is not reflected in actual underground development plans. When planning an urban utility tunnel in Korea, difficulties are expected, such as the cost of relocating existing lifelines, conflicts among the individual facility institutions, and the procurement and sharing of construction resources. Furthermore, the project can proceed only if a consensus is reached by a collective council composed of all facility institutions and project developers. Therefore, an optimal alternative should be proposed using economic analysis and a feasibility assessment system. In this study, the analytic hierarchy process (AHP) is performed considering the characteristics of urban areas, and the importance of each indicator is quantified. As a result, we can support reasonable optimization of design capacity using the feasibility assessment system.
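The AHP step that quantifies the importance of each indicator can be illustrated with the common geometric-mean approximation: priority weights are derived from a pairwise-comparison matrix. A minimal sketch with an assumed two-indicator matrix (the paper's actual indicators and judgments are not shown here):

```python
import math

def ahp_weights(matrix):
    """Priority weights from an AHP pairwise-comparison matrix using the
    geometric-mean-of-rows approximation: each row's geometric mean is
    normalized so the weights sum to 1."""
    n = len(matrix)
    gm = [math.prod(row) ** (1 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Example: indicator A judged 3x as important as indicator B.
weights = ahp_weights([[1.0, 3.0], [1 / 3, 1.0]])
```

For larger matrices, a consistency check (consistency ratio below 0.1) is normally applied before the weights are trusted.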

Lightweight Loop Invariant Code Motion for Java Just-In-Time Compiler on Itanium (Itanium상의 자바 적시 컴파일러를 위한 가벼운 루프 불변 코드 이동)

  • Yu Jun-Min;Choi Hyung-Kyu;Moon Soo-Mook
    • Journal of KIISE:Software and Applications / v.32 no.3 / pp.215-226 / 2005
  • Loop invariant code motion (LICM) optimization involves relatively heavy code analyses and is thus not readily applicable to Java Just-In-Time (JIT) compilation, where the JIT compilation time is part of the whole running time. "Classical" LICM optimization first analyzes the code and constructs both the def-use chains and the use-def chains, which are then used for performing code motions. This paper proposes a lightweight LICM algorithm that requires only the def-use chains of loop-invariant code (without use-def chains), exploiting the fact that the Java virtual machine is based on a stack machine and hence generates code with simpler patterns. We also propose two techniques that allow more code motion than classical LICM techniques. First, unlike previous JIT techniques that use LICM only in single-path loops for simplicity, we apply LICM safely to partially redundant code in multi-path loops (natural loops). Secondly, we move loop-invariant, partially redundant null-pointer-check code via the predication support in Itanium. The proposed techniques were implemented in a JIT compiler for the Itanium processor on Intel's ORP (Open Runtime Platform) Java virtual machine. On the SPECjvm98 benchmarks, the proposed technique increases the JIT compilation overhead by a geometric mean of 1.3%, yet improves the total running time by a geometric mean of 2.2%.
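The core invariance test can be sketched without any use-def chains: an instruction is loop-invariant when every operand is defined outside the loop or by an already-invariant instruction. A minimal illustration over a toy three-address IR (the tuple format and names are assumptions for illustration, not the paper's actual IR):

```python
def loop_invariants(loop_body, loop_defs):
    """Iteratively mark instructions (dst, op, srcs) as loop-invariant when
    every source operand is defined outside the loop (not in loop_defs) or
    by an already-invariant instruction. Iterates to a fixed point."""
    invariant = set()
    changed = True
    while changed:
        changed = False
        for dst, _op, srcs in loop_body:
            if dst in invariant:
                continue
            if all(s not in loop_defs or s in invariant for s in srcs):
                invariant.add(dst)
                changed = True
    return invariant
```

Here `t2` below stays in the loop because it depends on the induction variable `i`, while `t1` and `t3` can be hoisted; a real JIT must additionally check safety (exceptions, multi-path partial redundancy) before moving code.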

Structural Design Optimization of Lightweight Offshore Helidecks Using a Genetic Algorithm and AISC Standard Sections (유전 알고리듬 및 AISC 표준 단면을 사용한 경량화 헬리데크 구조 최적설계)

  • Sim, Kichan;Kim, Byungmo;Kim, Chanyeong;Ha, Seung-Hyun
    • Journal of the Computational Structural Engineering Institute of Korea / v.32 no.6 / pp.383-390 / 2019
  • A helideck is one of the essential structures in offshore platforms for the transportation of goods and operating personnel between land and offshore sites. As such, it should be carefully designed and installed for the safety of the offshore platform. In this study, a structural design optimization method for a lightweight offshore helideck is developed based on a genetic algorithm and an attainable design set concept. A helideck consists of several types of structural members such as plates, girders, stiffeners, trusses, and support elements, and the dimensions of these members are typically pre-defined by manufacturers. Therefore, design sets are defined by collecting the standard section data for these members from the American Institute of Steel Construction (AISC), and integer section labels are assigned as design variables in the genetic algorithm. The objective is to minimize the total weight of the offshore helideck while satisfying the maximum allowable stress criterion under various loading conditions including self-weight, wind direction, landing position, and landing condition. In addition, the unity check process is also utilized for additional verification of structural safety against buckling failure of the helideck.
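The integer-label encoding described above can be sketched with a toy genetic algorithm: each gene is an index into a section catalogue, and fitness is total weight plus a penalty for members whose strength is below demand. The catalogue values, penalty, and GA settings here are illustrative stand-ins, not the AISC tables or the paper's parameters.

```python
import random

# Hypothetical catalogue: label -> (weight per metre, strength), standing in
# for AISC standard sections; the real design uses the published tables.
SECTIONS = [(10.0, 50.0), (14.0, 80.0), (20.0, 120.0), (28.0, 180.0)]

def fitness(genome, demand):
    """Penalized weight: total member weight plus a large penalty for any
    member whose section strength falls below its load demand."""
    weight = sum(SECTIONS[g][0] for g in genome)
    penalty = sum(1000.0 for g, d in zip(genome, demand)
                  if SECTIONS[g][1] < d)
    return weight + penalty

def evolve(demand, pop_size=30, gens=60, seed=1):
    """Elitist GA over integer section labels: sort by fitness, keep the
    better half, and fill the rest with one-point crossover + mutation."""
    rng = random.Random(seed)
    n = len(demand)
    pop = [[rng.randrange(len(SECTIONS)) for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda g: fitness(g, demand))
        elite = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:          # integer-label mutation
                child[rng.randrange(n)] = rng.randrange(len(SECTIONS))
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda g: fitness(g, demand))
```

Because genes are catalogue indices, every candidate is automatically a manufacturable design, which is the point of the attainable-design-set encoding.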

Parameter optimization of agricultural reservoir long-term runoff model based on historical data (실측자료기반 농업용 저수지 장기유출모형 매개변수 최적화)

  • Hong, Junhyuk;Choi, Youngje;Yi, Jaeeung
    • Journal of Korea Water Resources Association / v.54 no.2 / pp.93-104 / 2021
  • Due to climate change, the sustainable management of agricultural reservoirs, the most numerous type of reservoir in Korea, has become important. However, the DIROM, a rainfall-runoff model for calculating agricultural reservoir inflow, has relied on a regression equation developed in the 1980s. This study optimized the parameters of the DIROM using a genetic algorithm (GA) based on historical inflow data for agricultural reservoirs that recently began observing inflow. The results showed that the error between the historical and simulated annual inflow using the optimal parameters decreased by about 80% compared with the existing parameters. The correlation coefficient with the historical inflow increased to 0.64, and the root mean square error decreased to 28.2 × 10³ m³. Consequently, if the DIROM uses optimal parameters based on the historical inflow of agricultural reservoirs, long-term reservoir inflow can be calculated with high accuracy. This study will contribute to future research using the historical inflow of agricultural reservoirs and to the improvement of rainfall-runoff model parameters. Furthermore, reliable long-term inflow data will support sustainable reservoir management and agricultural water supply.
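The two goodness-of-fit measures reported above, the correlation coefficient and the root mean square error between observed and simulated inflow, are standard and can be computed directly. A minimal sketch (the series values are placeholders, not the study's data):

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def pearson_r(obs, sim):
    """Pearson correlation coefficient between observed and simulated series."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    vo = math.sqrt(sum((o - mo) ** 2 for o in obs))
    vs = math.sqrt(sum((s - ms) ** 2 for s in sim))
    return cov / (vo * vs)
```

In a GA calibration like the study's, RMSE (or a similar error) typically serves as the objective function the parameter search minimizes.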

A Study on Optimization of Perovskite Solar Cell Light Absorption Layer Thin Film Based on Machine Learning (머신러닝 기반 페로브스카이트 태양전지 광흡수층 박막 최적화를 위한 연구)

  • Ha, Jae-jun;Lee, Jun-hyuk;Oh, Ju-young;Lee, Dong-geun
    • The Journal of the Korea Contents Association / v.22 no.7 / pp.55-62 / 2022
  • Perovskite solar cells are an active area of research in renewable energy fields, such as solar, wind, hydroelectric, marine, bio-, and hydrogen energy, which aim to replace fossil fuels such as oil, coal, and natural gas as power demand grows with the expanding use of the Internet of Things and virtual environments in the 4th industrial revolution. A perovskite solar cell is a solar cell device using an organic-inorganic hybrid material with a perovskite structure, and it offers the advantages of replacing existing silicon solar cells through high efficiency, low-cost solution processing, and low-temperature processes. To optimize the light-absorption-layer thin film predicted by the existing empirical method, reliability must be verified through device-characteristic evaluation. However, since evaluating the device characteristics of the light-absorbing thin film is expensive, the number of tests is limited. To address this, machine learning and artificial intelligence models hold great promise as auxiliary means of optimizing the light-absorption-layer thin film. In this study, to estimate the optimal light-absorption-layer thin film of perovskite solar cells, regression models using the support vector machine's linear, RBF, polynomial, and sigmoid kernels were compared to verify the difference in accuracy for each kernel function.
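The four kernel functions compared in the study have standard forms, shown below as plain functions of two feature vectors. The gamma, coef0, and degree values are assumptions for illustration; in practice they are hyperparameters tuned alongside the regression model.

```python
import math

def linear(x, z):
    """Linear kernel: the plain dot product."""
    return sum(a * b for a, b in zip(x, z))

def polynomial(x, z, degree=3, coef0=1.0):
    """Polynomial kernel: (x.z + coef0)^degree."""
    return (linear(x, z) + coef0) ** degree

def rbf(x, z, gamma=0.5):
    """RBF (Gaussian) kernel: exp(-gamma * ||x - z||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

def sigmoid(x, z, gamma=0.5, coef0=0.0):
    """Sigmoid kernel: tanh(gamma * x.z + coef0)."""
    return math.tanh(gamma * linear(x, z) + coef0)
```

A support vector regression model replaces dot products with one of these kernels; comparing them, as the study does, amounts to asking which similarity measure best fits the thin-film data.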