• Title/Summary/Keyword: Process Re-engineering

A new algorithm for design of support structures in additive manufacturing by using topology optimization

  • Haleh Sadat Kazemi;Seyed Mehdi Tavakkoli
    • Structural Engineering and Mechanics
    • /
    • v.86 no.1
    • /
    • pp.93-107
    • /
    • 2023
  • In this paper, a density-based topology optimization is proposed for generating the supports required in additive manufacturing to maintain the overhanging regions of the main structure during layer-by-layer fabrication. For this purpose, the isogeometric analysis method is employed for geometric modeling and structural analysis of the main and support structures. Two cases are investigated. In the first case, the design domain of the supports can easily be separated from the main structure by using distinct isogeometric patches. The second case arises when the main structure itself is optimized by topology optimization and the supports must be designed in the voids of the optimum layout. In this case, to avoid boundary identification and re-meshing for separating the support design domain from the main structure, a parameterization technique is proposed to identify the design domain of the supports. To achieve this, two density functions are defined over the entire domain to describe the main structure and the supporting areas. Furthermore, since the supports carry gravity loads while the main structure and its stiffness are not yet complete during the manufacturing process, the stiffness of the main structure is treated as negligible in the proposed method and the gravity loads are applied naturally in designing the support structures. By doing so, the results show that reasonable supports are created that continuously protect the overhanging surfaces of the main structure. Several examples are presented to demonstrate the efficiency of the proposed method and to compare the results with the literature.
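
The paper parameterizes the support design domain with a second density function defined over the whole model; as a much simpler stand-in for that idea, the sketch below identifies candidate support cells geometrically on a voxel grid by scanning each column for solid material with void beneath it. The grid, thresholds, and build direction are illustrative assumptions, not the paper's isogeometric formulation.

```python
# Minimal sketch (not the paper's isogeometric implementation): given a main-structure
# density field rho_main on a regular grid, mark the void cells lying underneath
# overhanging solid cells as the candidate support design domain.
import numpy as np

def support_design_domain(rho_main, solid_thresh=0.5):
    """Return a boolean mask of void cells that sit below overhanging material.

    rho_main: 2D array, rho_main[i, j] is the density of cell (row i, column j),
              with row 0 at the bottom and the build direction pointing upward.
    """
    ny, nx = rho_main.shape
    solid = rho_main > solid_thresh
    domain = np.zeros_like(solid)
    for j in range(nx):
        for i in range(ny - 1, 0, -1):               # scan each column from top to bottom
            if solid[i, j] and not solid[i - 1, j]:  # solid cell with void directly below -> overhang
                k = i - 1
                while k >= 0 and not solid[k, j]:    # everything down to the build platform
                    domain[k, j] = True              # is a candidate support cell
                    k -= 1
    return domain

# Example: a horizontal slab whose right half overhangs a void.
rho = np.zeros((6, 8))
rho[4, :] = 1.0          # slab near the top of the domain
rho[:4, :4] = 1.0        # solid block supporting only its left half
print(support_design_domain(rho).astype(int)[::-1])  # printed with the build direction up
```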

A study on the topology optimization of structures (구조물의 토폴로지 최적화에 관한 연구)

  • Park, Sang-Hun;Yun, Seong-Gi
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.21 no.8
    • /
    • pp.1241-1249
    • /
    • 1997
  • The problem of structural topology optimization can be relaxed and converted into an optimal density distribution problem. The optimal density distribution must be post-processed to obtain the real shape of the structure. The extracted shape can then be used for the next step, which is usually shape optimization based on the boundary movement method. From a practical point of view, it is very important to obtain an optimal density distribution from which the corresponding shape can easily be extracted. Among many other factors, the presence of checker-board patterns is a serious obstacle to shape extraction. The nature of checker-board patterns appears to be a form of numerical locking. In this paper, an efficient algorithm is presented to suppress checker-board patterns. At each iteration, the density is re-distributed after it is updated according to the optimization rule. The algorithm also yields an optimal density distribution whose corresponding shape has a smooth boundary. Some examples are presented to show the performance of the density re-distribution algorithm. Checker-board patterns are successfully suppressed and the resulting shapes are considered very satisfactory.
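
The abstract does not give the exact re-distribution rule, so the sketch below shows a generic neighborhood density filter of the kind commonly used to suppress checker-board patterns after each density update; the radius and weighting are illustrative choices, not the paper's algorithm.

```python
# Minimal sketch of a standard neighborhood density filter often used to suppress
# checker-board patterns in density-based topology optimization. This is a generic
# filter, not necessarily the re-distribution rule proposed in the paper.
import numpy as np

def redistribute_density(rho, radius=1.5):
    """Replace each cell density with a distance-weighted average of its neighbors."""
    ny, nx = rho.shape
    out = np.zeros_like(rho)
    r = int(np.ceil(radius))
    for i in range(ny):
        for j in range(nx):
            wsum, vsum = 0.0, 0.0
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < ny and 0 <= jj < nx:
                        w = max(0.0, radius - np.hypot(di, dj))  # linear distance weight
                        wsum += w
                        vsum += w * rho[ii, jj]
            out[i, j] = vsum / wsum
    return out

# Usage inside an optimization loop (schematic):
#   rho = update_density(rho, sensitivities)      # hypothetical optimization-rule update
#   rho = redistribute_density(rho, radius=1.5)   # suppress checker-boarding
```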

Successful ERP Implementation Model : Exploratory Model from Ernst & Young PER (Package Enabled Reengineering) and Change Management Methodology (성공적 ERP구축 모델 : Ernst & Young의 PER(Package Enabled Reengineering) 방법론과 변화관리 방법론을 중심으로 한 탐색적 모델)

  • An, Joon-Mo;Park, Dong-Bae
    • Korean Management Science Review
    • /
    • v.15 no.2
    • /
    • pp.59-70
    • /
    • 1998
  • According to the Gartner Group, the market for ERP software in Korea is growing rapidly. However, the number of successful ERP implementations is quite small. Standard (unmodified) ERPs are built on best practices, and the improvements expected from a successful implementation derive from the best practices built into the software. Many of these improvements are lost through modification of the "standard" software; even minor changes can significantly reduce the benefits, and both implementation time and risk increase with modification. We introduce a methodology called Package Enabled Reengineering (PER) and the main components of the change management program by Ernst & Young. The "To-Be" model can be developed from the software's capabilities, and change management processes such as continuous education and self-development are required. The philosophy of these change management processes is to let the software package drive the re-engineering practices and to avoid moving the software toward the "As-Is" process. Extensive top management involvement, a major focus on speed, an extensive communication program, and a clear picture of the future are essential components of change management. We believe the compiled experiences and the model have implications for both practitioners and academics in their fields.

Characteristic on the Heating Deformation of Sleeve by Heating Method (열처리공법에 따른 Sleeve의 열처리 변형 특성)

  • Youn, Il-Joong;Lyu, Sung-Ki;An, Chang-Woo;Ahn, In-Hyo
    • Journal of the Korean Society of Safety
    • /
    • v.21 no.3 s.75
    • /
    • pp.1-7
    • /
    • 2006
  • Nowadays, among transmission parts, the sleeve is becoming increasingly important for exact and smooth shifting whenever the driver changes the gear ratio. For exact and smooth shifting, all parts involved in gear shifting should be machined precisely to the dimensions intended by the designers. In particular, the sleeve, the most important functional part for shifting the gear ratio as the driver intends, requires high precision and quality in both-side runout and outer-diameter runout as well as in the small and large diameters of the inner spline, because it is assembled with the synchro hub spline and shifts directly against the mating cone. Therefore, heat treatment (hereinafter referred to as H.T.M.T.) should be applied to prevent friction and impact losses when shifting against the mating cone. However, deformation problems arise in most H.T.M.T. processes, producing defective parts.

Enhancement of Ginsenosides Conversion Yield by Steaming and Fermentation Process in Low Quality Fresh Ginseng (증숙 발효 공정에 의한 파삼의 진세노사이드 전환 수율 증진)

  • Choi, Woon Yong;Lim, Hye Won;Choi, Geun Pyo;Lee, Hyeon Yong
    • Korean Journal of Medicinal Crop Science
    • /
    • v.22 no.3
    • /
    • pp.223-230
    • /
    • 2014
  • This study was performed to enhance the content of low-molecular-weight ginsenosides in low-quality fresh ginseng using a combined steaming and fermentation process. To increase the contents of Rg2, Rg3, Rh2 and CK in low-quality fresh ginseng, a steaming process was applied at 90℃ for 12 h, followed by fermentation with Lactobacillus rhamnosus HK-9 at 36℃ for 72 h. The contents of ginsenosides Rg1, Rb1, Rc, Re and Rd decreased under steaming combined with fermentation, whereas ginsenosides Rg2, Rg3, Rh2 and CK increased after the process. Under steaming combined with fermentation, the low-molecular-weight ginsenosides Rg2, Rg3, Rh2 and CK increased to 3.231 mg/g, 2.585 mg/g, 1.955 mg/g and 2.478 mg/g, respectively. In addition, the concentration of benzo[α]pyrene in extracts of the low-quality fresh ginseng treated by the combined process was 0.11 ppm, whereas it was 0.22 ppm when treated by steaming alone. These results may be attributed to the efficient breakdown of the 1,2-glucoside and 1,4-glucoside linkages on the ginsenoside backbone by steaming combined with fermentation. The results indicate that the steaming and fermentation processes can increase the contents of Rg2, Rg3, Rh2 and CK in low-quality fresh ginseng.

Low Computational Algorithm of Soft-Decision Extended BCH Decoding Algorithm for Next Generation DVB-RCS Systems (차세대 DVB-RCS 시스템을 위한 저 계산량 연판정 e-BCH 복호 알고리즘)

  • Park, Tae-Doo;Kim, Min-Hyuk;Lim, Byeong-Su;Jung, Ji-Won
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.22 no.7
    • /
    • pp.705-710
    • /
    • 2011
  • In this paper, we propose a low-computational-complexity soft-decision e-BCH decoding algorithm based on the Chase algorithm. To generate the test patterns, the least reliable received symbols must be re-ordered, and this ordering together with the search for the optimal decoded codeword requires high computational complexity. Therefore, this paper proposes a method that lowers the computational complexity of the soft-decision e-BCH decoding process.
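
As a reference point for the step the abstract refers to, the sketch below shows generic Chase-2 test-pattern generation: sort the received symbols by reliability, take the p least reliable positions, and flip them in every combination. This is the standard Chase step, not the reduced-complexity variant proposed in the paper.

```python
# Minimal sketch of Chase-2 test-pattern generation (generic, illustrative values).
from itertools import product

def chase2_test_patterns(llrs, hard_bits, p=3):
    """Return candidate hard-decision vectors for Chase-2 decoding.

    llrs      : per-bit reliabilities (larger magnitude = more reliable)
    hard_bits : hard-decision bits (0/1) from the channel
    p         : number of least reliable positions to perturb (2**p patterns)
    """
    least_reliable = sorted(range(len(llrs)), key=lambda i: abs(llrs[i]))[:p]
    patterns = []
    for flips in product([0, 1], repeat=p):
        candidate = list(hard_bits)
        for idx, flip in zip(least_reliable, flips):
            candidate[idx] ^= flip          # flip the selected unreliable bits
        patterns.append(candidate)
    return patterns

# Each candidate would then be passed to a hard-decision BCH decoder, and the surviving
# codeword with the best soft metric would be selected.
llrs = [2.1, -0.3, 0.9, -1.8, 0.2, 3.0, -0.1, 1.4]
hard = [0, 1, 0, 1, 0, 0, 1, 0]
print(len(chase2_test_patterns(llrs, hard, p=3)))  # -> 8 test patterns
```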

Synthesis of Nano-sized TiO2 Powder using a Hydrothermal Process (수열합성법을 이용한 TiO2 나노 입자의 합성)

  • Kim, Gang Hyuk;Lee, Woo Jin;Kim, Donggyu;Lee, Sung Keun;Lee, Sang Hwa;Kim, Insoo
    • Korean Journal of Metals and Materials
    • /
    • v.48 no.6
    • /
    • pp.543-550
    • /
    • 2010
  • This paper investigated the synthesis conditions of nano-sized TiO2 powder in a hydrothermal process over a temperature range of 100~180℃, considering the precipitation agent, precipitation pH, reaction temperature and time. Titanium hydroxide formed by NH4OH exhibited a lower crystallization temperature than that formed by NaOH and produced less aggregated TiO2 particles. As the precipitation pH increased above 8, the shape of the particles changed from spherical to needle-like, which appeared to be caused by dissolution and re-precipitation of the titanium hydroxide in an alkaline environment.

The Problems in Digital Watermarking into Intra-Frames of H.264/AVC (H.264-기반 인트라 프레임의 디지털 워터마킹 문제)

  • Choi, Hyun-Jun;Seo, Young-Ho;Kim, Dong-Wook
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.13 no.2
    • /
    • pp.233-242
    • /
    • 2009
  • This paper intends to show the effect of intra-prediction on a typical digital watermarking method and the fact that such a method becomes very ineffective when applied to the intra-frames of H.264. The target watermarking method is one designed for imperceptibility and robustness and is assumed to be performed during the intra-compression process of H.264; it is assumed to insert watermark data and to extract it later for certification if needed. The problem is that the data obtained by reversing the watermark insertion process to extract the watermark differs from the original. We show experimentally that this stems from the intra-prediction itself: image data that is only compressed, without watermarking, changes when it is re-compressed under the same conditions as the first compression, because the intra-prediction modes as well as the coefficient values change. We also apply one blind and one semi-blind watermarking method to show that typical attacks after watermarking make this problem much more serious and dramatically lower the effectiveness of the watermarking. Considering the experimental data, we conclude that the typical watermarking methods researched so far cannot guarantee the effectiveness of intra-frame watermarking and that a new kind of methodology is highly required.
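
The failure mode can be pictured with a toy round-trip that is not the actual H.264 pipeline: a watermark bit forced into the LSB of a quantized residual coefficient is lost when the second compression pass happens to pick a different intra predictor. All numbers below are made up for illustration.

```python
# Toy illustration of the re-compression problem described in the abstract.
import numpy as np

Q = 8  # quantization step (illustrative)

def compress(block, predictor):
    return np.round((block - predictor) / Q).astype(int)   # quantized residual

def decompress(coeffs, predictor):
    return coeffs * Q + predictor                           # reconstructed block

def embed_bit(coeffs, bit, pos=0):
    c = coeffs.copy()
    c[pos] = (c[pos] & ~1) | bit                            # force LSB of one coefficient
    return c

def extract_bit(coeffs, pos=0):
    return coeffs[pos] & 1

block = np.array([120, 118, 115, 110])
pred1 = np.array([100, 100, 100, 100])                      # predictor at first encoding

coeffs = embed_bit(compress(block, pred1), bit=1)
recon = decompress(coeffs, pred1)                           # watermarked, decoded frame

# Second compression of the decoded frame: the encoder may now pick a different intra
# predictor, e.g. because the neighboring reconstructed pixels have changed.
pred2 = np.array([104, 104, 104, 104])
coeffs2 = compress(recon, pred2)

print(extract_bit(coeffs), extract_bit(coeffs2))            # embedded bit vs. bit after re-compression
```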

Electromagnetic Forming Process Analysis Based on Coupled Simulations of Electromagnetic Analysis and Structural Analysis

  • Lee, Man Gi;Lee, Seung Hwan;Kim, Sunwoo;Kim, Jin Ho
    • Journal of Magnetics
    • /
    • v.21 no.2
    • /
    • pp.215-221
    • /
    • 2016
  • We conducted a phased electromagnetic forming process analysis (EFPA) over time through a coupling of electromagnetic analysis and structural analysis. The analysis is performed through a direct linkage between the two, and the process is repeated until the electric current is completely discharged by the forming coil. We calculate the forming force acting on the workpiece using MAXWELL, a commercial electromagnetic finite element analysis program, and then simulate the plastic behavior by feeding the calculated forming force into ANSYS, a commercial structural finite element analysis program. First, our approach can reduce the error introduced by data transformation, using fewer transformations, because it directly links the electromagnetic and structural analyses and removes the step of numerically fitting a graph of the forming force used in the existing electromagnetic forming process analysis. Second, it can simulate a more realistic forming force by keeping a certain distance between nodes with the re-mesh function during the repeated analysis of the MAXWELL results until the current is completely discharged. We compare and review the results of the proposed phased EFPA over time against the existing analysis method, which uses only the peak value of the forming force acting on the workpiece.
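
The coupling loop itself can be sketched independently of the commercial tools. In the toy example below, a damped-oscillation "electromagnetic step" stands in for the MAXWELL solve and a one-variable plasticity update stands in for the ANSYS solve; the loop structure (EM solve, structural solve, repeat until the coil current is discharged) mirrors the procedure described above, while every model and constant is an illustrative placeholder, not a real solver API.

```python
# Schematic, self-contained sketch of the coupled EM-structural loop (toy stand-ins only).
import math

def em_solve(t, peak=50e3, tau=1e-4, freq=8e3):
    """Toy electromagnetic step: damped oscillating coil current and a force ~ current^2."""
    current = peak * math.exp(-t / tau) * math.sin(2 * math.pi * freq * t)
    force = 1e-6 * current ** 2            # stand-in for the computed Lorentz force [N]
    return current, force

def structural_solve(disp, force, dt, yield_force=500.0, coeff=2e5):
    """Toy structural step: accumulate plastic displacement while force exceeds yield."""
    if force > yield_force:
        disp += (force - yield_force) / coeff * dt
    return disp

t, dt, disp = 0.0, 1e-6, 0.0
while True:
    current, force = em_solve(t)              # electromagnetic analysis
    disp = structural_solve(disp, force, dt)  # structural analysis with that forming force
    # (a real implementation would also re-mesh the deformed workpiece here)
    t += dt
    if t > 2e-5 and abs(current) < 1.0:       # stop once the coil current has discharged
        break

print(f"toy plastic displacement: {disp:.3e} m after {t * 1e6:.0f} us")
```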

A Design for Extension Codec based on Legacy Codec (레거시 코덱 기반 확장 코덱 설계)

  • Young, Su Heo;Bang, Gun;Park, Gwang Hoon
    • Journal of Broadcast Engineering
    • /
    • v.20 no.4
    • /
    • pp.509-520
    • /
    • 2015
  • A design for the merge mode of three-dimensional High Efficiency Video Coding (3D-HEVC) is proposed in this paper. The proposed design can reduce the implementation complexity by removing duplicated HEVC modules. For an extension codec, implementation complexity is as crucial as coding efficiency, meaning that, where possible, the extension codec should be implementable by reusing the design of the legacy codec as-is. However, the existing 3D-HEVC merging process was integrated inside the HEVC merging process, so the duplicated HEVC merging process had to be fully re-implemented in 3D-HEVC, and consequently the implementation complexity of the extension codec was very high. The proposed 3D-HEVC merge mode is divided into two stages: a process that reuses the HEVC modules without any modification, and a reprocessing step for the merging modules newly added or modified in 3D-HEVC. By applying the proposed method, the re-implemented HEVC modules, which accounted for 51.4% of the 3D-HEVC merge mode as confirmed through an operational analysis of the algorithm, can be eliminated while maintaining the same coding efficiency and computational complexity.
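
The two-stage structure can be pictured as a thin wrapper around the untouched legacy candidate derivation. In the sketch below, the function and candidate names are hypothetical illustrations rather than the actual HEVC/3D-HEVC reference-software interfaces.

```python
# Minimal sketch of the two-stage merge design: keep the legacy HEVC merge candidate
# derivation untouched and layer the 3D-HEVC-specific candidates on top in a separate
# reprocessing step (names and candidate labels are illustrative placeholders).
from typing import List

def hevc_merge_candidates(block) -> List[str]:
    """Stage 1: unmodified legacy HEVC module (spatial + temporal candidates)."""
    return ["A1", "B1", "B0", "A0", "B2", "TMVP"]        # placeholder candidate labels

def reprocess_3d_candidates(candidates: List[str], block, max_candidates: int = 6) -> List[str]:
    """Stage 2: 3D-HEVC reprocessing - insert extension candidates, then prune the list."""
    extended = ["IvMC", "IvDC"] + candidates + ["VSP"]   # illustrative 3D-HEVC additions
    return extended[:max_candidates]                      # keep the list within its size limit

def merge_candidate_list_3d(block) -> List[str]:
    return reprocess_3d_candidates(hevc_merge_candidates(block), block)

print(merge_candidate_list_3d(block=None))
```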