• Title/Summary/Keyword: Run-Length


Reynolds and Froude number effect on the flow past an interface-piercing circular cylinder

  • Koo, Bonguk;Yang, Jianming;Yeon, Seong Mo;Stern, Frederick
    • International Journal of Naval Architecture and Ocean Engineering / v.6 no.3 / pp.529-561 / 2014
  • The two-phase turbulent flow past an interface-piercing circular cylinder is studied using a high-fidelity orthogonal curvilinear grid solver with a Lagrangian dynamic subgrid-scale model for large-eddy simulation and a coupled level set and volume-of-fluid method for air-water interface tracking. The simulations cover the sub-critical, critical, and post-critical Reynolds number regimes and sub- and super-critical Froude numbers in order to investigate the effect of both dimensionless parameters on the flow. Significant changes in flow features near the air-water interface were observed as the Reynolds number was increased from the sub-critical to the critical regime. The interface substantially delays the separation point near the free surface for all Reynolds numbers. The separation region at intermediate depths is remarkably reduced in the critical Reynolds number regime. The deep flow resembles single-phase turbulent flow past a circular cylinder, but includes the effects of the free surface and the limited span length for sub-critical Reynolds numbers. At different Froude numbers, the air-water interface exhibits significantly different structures, including breaking bow waves with splashes and bubbles at high Froude numbers. Instantaneous and mean flow features such as interface structures, vortex shedding, Reynolds stresses, and vorticity transport are also analyzed. The results are compared with reference experimental data available in the literature. The deep flow is also compared with single-phase turbulent flow past a circular cylinder in similar ranges of Reynolds numbers. Discussion is provided concerning the limitations of the current simulations and of the available experimental data, along with future research directions.
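As background for the two parameters swept in this study, the conventional diameter-based definitions are sketched below; the abstract does not restate the paper's exact reference scales, so the choice of U, D, nu, and g here is an assumption for illustration only.

```python
import math

# Hedged sketch: conventional, diameter-based definitions of the two
# dimensionless parameters varied in the study (reference scales assumed).

def reynolds_number(U, D, nu):
    """Re = U*D/nu: ratio of inertial to viscous forces."""
    return U * D / nu

def froude_number(U, D, g=9.81):
    """Fr = U/sqrt(g*D): ratio of inertial to gravitational forces."""
    return U / math.sqrt(g * D)

# Example: a 0.1 m cylinder in water (nu ~ 1.0e-6 m^2/s) with a 1.5 m/s stream
print(f"Re = {reynolds_number(1.5, 0.1, 1.0e-6):.2e}")  # ~1.5e+05
print(f"Fr = {froude_number(1.5, 0.1):.2f}")            # ~1.51
```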

Magnetic Signals Analysis for Vehicle Detection Sensor and Magnetic Field Shape (자기신호분석을 통한 차량의 감지센서와 자기형상에 관한 연구)

  • Choi, Hak-Yun
    • The Journal of Korean Institute of Communications and Information Sciences / v.40 no.2 / pp.349-354 / 2015
  • This paper measures magnetic signals with a magnetic sensor and analyzes their waveforms for vehicle detection. A Honeywell MR sensor was used, and a Helmholtz coil 1.2 m long on each of its three axes was fabricated to check the capability of the sensor and estimate its ability to detect the magnetic field. Vehicle detection was performed in the following steps: installing the sensor in a traffic lane and outside the lane, measuring the magnetic field while a driver passed a vehicle over it, and measuring the magnetic fields of seven vehicles of different sizes. The sensor was also installed in the parking and non-parking areas of an SUV and a small vehicle to analyze the shape of the magnetic field, and the field produced by different parts of a vehicle was analyzed. The analysis showed that the magnetic peak value was larger in the traffic lane than outside it, that the more complex waveform was useful for distinguishing whether the running vehicle was in the lane directly above the installed sensor, and that vehicle types could be classified because larger vehicles produced larger variations in the magnetic field. It was also confirmed that the field shapes produced by individual parts of a vehicle could be estimated.
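The detection logic itself is not spelled out in the abstract; the sketch below is only a generic illustration of how a single-axis magnetometer trace can be thresholded against a quiet baseline to flag a passing vehicle, with all names and numbers invented for the example.

```python
import numpy as np

def detect_disturbance(signal, baseline_len=50, k=4.0):
    """Generic sketch: flag samples deviating strongly from a quiet baseline.

    signal       -- 1-D array of readings from one magnetometer axis
    baseline_len -- number of leading samples assumed vehicle-free
    k            -- threshold in baseline standard deviations
    """
    baseline = signal[:baseline_len]
    mu, sigma = baseline.mean(), baseline.std()
    deviation = np.abs(signal - mu)
    mask = deviation > k * sigma      # True where the field is disturbed
    return mask, deviation.max()      # larger peaks for larger vehicles

# Synthetic trace: background noise plus a bump while a "vehicle" passes
t = np.linspace(0.0, 1.0, 500)
trace = 0.02 * np.random.randn(500) + np.where((t > 0.4) & (t < 0.6), 1.0, 0.0)
mask, peak = detect_disturbance(trace)
print(mask.any(), round(float(peak), 2))
```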

The Cooperative Parallel X-Match Data Compression Algorithm (협동 병렬 X-Match 데이타 압축 알고리즘)

  • 윤상균
    • Journal of KIISE: Computer Systems and Theory / v.30 no.10 / pp.586-594 / 2003
  • The X-Match algorithm is a lossless compression algorithm suitable for hardware implementation owing to its simplicity. It can compress 32 bits per clock cycle and is suitable for real-time compression. However, as the bus width increases to 64 bits, the compression unit also needs to grow. This paper proposes the cooperative parallel X-Match (X-MatchCP) algorithm, which improves compression speed by running two X-Match engines in parallel. Whereas the previous parallel X-Match algorithm searches an individual dictionary per engine, X-MatchCP searches the whole dictionary for both words, combines the compression codes generated for the two words by the parallel X-Match compressors, and outputs the combined code. The compression ratio of X-MatchCP is almost the same as that of X-Match. The X-MatchCP algorithm is described and simulated in the Verilog hardware description language.
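X-Match itself is a hardware dictionary compressor; the toy model below is only a software illustration of the shared-dictionary idea described above (full hits and misses with a move-to-front dictionary, no partial matching, no bit-level code packing), not the authors' Verilog design. X-MatchCP would look up two such words per cycle against the same dictionary.

```python
# Toy software model of the dictionary step behind X-Match: each incoming
# word is searched in a small move-to-front dictionary and emitted either as
# a short dictionary index (hit) or as a literal (miss).

def xmatch_toy_compress(words, dict_size=16):
    dictionary = []                 # most-recently-used entries first
    out = []
    for w in words:                 # w models one 32-bit word
        if w in dictionary:
            idx = dictionary.index(w)
            out.append(("HIT", idx))                      # short index code
            dictionary.insert(0, dictionary.pop(idx))     # move to front
        else:
            out.append(("MISS", w))                       # literal word
            dictionary.insert(0, w)
            if len(dictionary) > dict_size:
                dictionary.pop()                          # drop oldest entry
    return out

print(xmatch_toy_compress([7, 7, 42, 7, 42, 42]))
# [('MISS', 7), ('HIT', 0), ('MISS', 42), ('HIT', 1), ('HIT', 1), ('HIT', 0)]
```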

A Study on the Methodology of Land Consolidation of Sloping Paddies in Mountain Valleys for Farm Mechanization (기계화를 전제로한 산간경사지답 경지정리 방안에 관한 연구)

  • Sung, Chan-Yong;Hwang, Eun;Han, Wook-Dong
    • Magazine of the Korean Society of Agricultural Engineers / v.23 no.2 / pp.61-69 / 1981
  • The paddies on the hillsides in Gonggeun-myeon, Hoengseong-gun, Kangweon-do lie on steep slopes and have irregular boundaries. A land consolidation in such an area would therefore require a high ratio of land loss and a large amount of earth moving if it followed the existing design criteria, which separate drainage and irrigation ditches in a scheme. Because of the consequent increase in construction cost, the project has not been envisaged. In order to secure the introduction of small and medium-sized farm machinery into the paddies, farm plots were planned to be straight, with drainage laid out according to the topography. Findings from the comparison of methodologies are as follows. 1. In places with a slope of more than 1/30, a reduction in earth moving can be expected with plots parallel to the contours. 2. For the effective use of farm machinery, a plot should run straight and parallel to the contours, and the ratio of plot length to width should be more than six. 3. In places with a slope of more than 1/10, a reduction in earth moving and an effective introduction of farm machinery can be expected with straight plots parallel to the contours, but it is undesirable to introduce such a scheme in these places because of the difficulties in acreage computation and farmers' hesitation. 4. A system with a single canal for both irrigation and drainage is highly effective in decreasing the ratio of land loss as well as the construction cost. 5. Plots parallel to the contours combined with a canal for both irrigation and drainage are highly effective in decreasing the construction cost. 6. To avoid the subdivision of plots, cooperative farming is desirable where a plot has more than two owners.

Exponentially Weighted Moving Average Chart for High-Yield Processes

  • Kotani, Takayuki;Kusukawa, Etsuko;Ohta, Hiroshi
    • Industrial Engineering and Management Systems / v.4 no.1 / pp.75-81 / 2005
  • Borror et al. discussed the EWMA (Exponentially Weighted Moving Average) chart to monitor the count of defects that follows the Poisson distribution, referred to as the $EWMA_c$ chart, as an alternative to the Shewhart c chart. In the $EWMA_c$ chart, the Markov chain approach is used to calculate the ARL (Average Run Length). On the other hand, in order to monitor the process fraction defective P in high-yield processes, Xie et al. presented the CCC (Cumulative Count of Conforming)-r chart, whose quality characteristic is the cumulative count of conforming items inspected until $r({\geq}2)$ nonconforming items are observed. Furthermore, Ohta and Kusukawa presented the $CS(Confirmation Sample)_{CCC-r}$ chart as an alternative to the CCC-r chart. As a chart better suited to high-yield processes, in this paper we present an $EWMA_{CCC-r}$ chart that detects small or moderate shifts in P more sensitively than the $CS_{CCC-r}$ chart. The proposed $EWMA_{CCC-r}$ chart can be constructed by applying the design method of the $EWMA_c$ chart to the CCC-r chart. The ANOS (Average Number of Observations to Signal) of the proposed chart is compared with that of the $CS_{CCC-r}$ chart through computer simulation. It is demonstrated through numerical examples that the performance of the proposed chart is superior to that of the $CS_{CCC-r}$ chart.
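For reference, all of the EWMA-type charts above build on the standard recursion Z_i = lambda*X_i + (1 - lambda)*Z_{i-1}. The sketch below shows only this textbook form, with asymptotic limits for a Poisson count as in the classic EWMA_c design; the CCC-r-specific construction of the proposed chart is not reproduced here, and the numeric constants are illustrative assumptions.

```python
import math

# Textbook EWMA recursion and asymptotic control limits for a Poisson count;
# lam and L are typical design constants, not the values used in the paper.

def ewma_path(xs, lam=0.2, z0=None):
    z = xs[0] if z0 is None else z0
    path = []
    for x in xs:
        z = lam * x + (1.0 - lam) * z     # exponentially weighted update
        path.append(z)
    return path

def asymptotic_limits(c0, lam=0.2, L=2.7):
    half_width = L * math.sqrt(lam * c0 / (2.0 - lam))
    return c0 - half_width, c0 + half_width

counts = [2, 3, 1, 2, 2, 5, 6, 7]          # defect counts per sample
print([round(z, 2) for z in ewma_path(counts)])
print(asymptotic_limits(c0=2.0))
```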

A PRML System for Perpendicular Magnetic Recording Channel in Wireless Multimedia Networks (무선 멀티미디어 네트워크에서 수직 자기기록장치를 위한 PRML 시스템)

  • Kim Jeong-so;Hwang Gi-yean
    • Journal of the Korea Academia-Industrial cooperation Society / v.5 no.5 / pp.454-457 / 2004
  • Partial response maximum likelihood (PRML) is a powerful and indispensable detection scheme for perpendicular magnetic recording channels. The proposed method is a low-complexity detection scheme related to the PRML system. The simulation results show that PR(1,2,3,4,3,2,1)ML and PR(1,2,3,3,2,1)ML using modulation encoding with R=2/3 have the most improved performance at K=3,4. However, in the case of K=3, R=2/3 PR(1,1,1,1)ML effectively reduces the complexity compared to PR(1,2,3,3,2,1)ML, but it has a performance degradation of at most about 1.5 dB. In the case of K=4, R=1 PR(1,2,2,1)ML has very low complexity compared to R=2/3 PR(1,2,3,4,3,2,1)ML, but it has a performance degradation of at most about 2 dB.
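As a point of reference for the PR targets named above, the sketch below only forms the ideal (noise-free) samples of a partial-response channel by convolving an NRZ bit sequence with the target taps; the actual detector in the paper (Viterbi-based ML detection plus the R=2/3 modulation code) is not reproduced.

```python
import numpy as np

def pr_ideal_samples(bits, target=(1, 2, 2, 1)):
    """Ideal noise-free readback samples of a partial-response channel:
    the recorded NRZ sequence convolved with the PR target taps. A PRML
    detector runs a Viterbi search for the bit pattern whose ideal samples
    are closest to the noisy readback waveform."""
    nrz = 2 * np.asarray(bits, dtype=int) - 1    # map {0,1} -> {-1,+1}
    return np.convolve(nrz, target)

print(pr_ideal_samples([1, 0, 1, 1, 0, 0, 1], target=(1, 2, 2, 1)))
```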

Fast Algorithms for Computing the Shortest Path between Two Points inside a Simple Polygon (다각형 내부에 있는 두 점 사이의 최단 경로를 구하는 빠른 알고리즘)

  • Kim, Soo-Hwan;Lim, Intaek;Choi, Jinoh;Choi, Jinho
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2009.10a / pp.807-810 / 2009
  • In this paper, we consider shortest path problems in a simple polygon. The shortest path between two points inside a polygon P is a minimum-length path among all paths connecting them that do not pass through the exterior of P. A linear-time algorithm for computing the shortest path in a general simple polygon requires triangulating the polygon as preprocessing. Linear-time triangulation is known to be very complex to understand and implement, and it is also inefficient unless the input size is very large. In this paper, we present customized shortest path algorithms for specific polygon classes such as star-shaped polygons, edge-visible polygons, and monotone polygons. These algorithms need no triangulation as preprocessing, so they are simple and run very fast in linear time.
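As a small illustration of the definition above (a path that may not pass through the exterior of P), the sketch below tests only the trivial base case: whether the straight segment between two interior points already stays inside, by checking for proper crossings with the boundary edges. Degenerate contacts through polygon vertices are ignored, and this is not one of the class-specific linear-time algorithms of the paper.

```python
# Base case of the interior shortest-path problem: if the straight segment
# between two interior points does not properly cross any boundary edge of
# the simple polygon, that segment itself is the shortest path.

def _cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def _properly_intersect(p1, p2, q1, q2):
    d1, d2 = _cross(q1, q2, p1), _cross(q1, q2, p2)
    d3, d4 = _cross(p1, p2, q1), _cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def straight_segment_suffices(s, t, polygon):
    n = len(polygon)
    return not any(_properly_intersect(s, t, polygon[i], polygon[(i + 1) % n])
                   for i in range(n))

L_shape = [(0, 0), (4, 0), (4, 2), (2, 2), (2, 4), (0, 4)]     # non-convex polygon
print(straight_segment_suffices((1, 1), (1, 3), L_shape))      # True
print(straight_segment_suffices((3, 1), (1, 3.5), L_shape))    # False: path must bend near (2, 2)
```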

A Design of Economic CUSUM Control Chart Incorporating Quality Loss Function (품질손실을 고려한 경제적 CUSUM 관리도)

  • Kim, Jungdae
    • Journal of Korean Society of Industrial and Systems Engineering / v.41 no.4 / pp.203-212 / 2018
  • Quality requirements of manufactured products or parts are given in the form of specification limits on the quality characteristics of individual units. If a product is to meet the customer's fitness-for-use criteria, it should be produced by a process that is stable or repeatable. In other words, the process must be capable of operating with little variability around the target or nominal value of the product's quality characteristic. In order to maintain and improve product quality, we need to apply statistical process control techniques such as histograms, check sheets, Pareto charts, cause-and-effect diagrams, or control charts. Among these techniques, the most important one is control charting. Cumulative sum (CUSUM) control charts have been used in statistical process control (SPC) in industry for monitoring process shifts and supporting online measurement. The objective of this research is to apply Taguchi's quality loss function concept to cost-based CUSUM control chart design. In this study, a modified quality loss function was developed to reflect quality loss situations where the general quadratic loss curve is not appropriate. This research also provides a methodology for the design of CUSUM charts using the Taguchi quality loss function concept, based on the minimum-cost-per-hour criterion. The new model differs from previous models in that it assumes quality loss is incurred even in the in-control period. The model was compared with other cost-based CUSUM models by Wu and Goel. According to a numerical sensitivity analysis, the proposed model results in a longer average run length in the in-control period than the other two models.
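For context, the two ingredients combined in the paper are sketched below in their textbook forms: the tabular CUSUM recursion and Taguchi's quadratic loss L(y) = k(y - T)^2. The paper's modified loss function and its economic design parameters are not reproduced here, and the numbers in the example are invented.

```python
# Textbook forms of the two building blocks named in the abstract: the
# tabular CUSUM recursion and Taguchi's quadratic loss.

def cusum(xs, target, k):
    """Upper/lower one-sided CUSUM statistics with reference value k."""
    c_plus = c_minus = 0.0
    history = []
    for x in xs:
        c_plus = max(0.0, c_plus + (x - target) - k)
        c_minus = max(0.0, c_minus + (target - x) - k)
        history.append((round(c_plus, 3), round(c_minus, 3)))
    return history

def taguchi_loss(y, target, k_loss):
    """Classic quadratic loss L(y) = k_loss * (y - target)**2."""
    return k_loss * (y - target) ** 2

data = [10.1, 9.9, 10.0, 10.4, 10.6, 10.7]
print(cusum(data, target=10.0, k=0.25)[-1])         # growing upper CUSUM suggests a shift
print(taguchi_loss(10.6, target=10.0, k_loss=5.0))  # loss is incurred even within spec
```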

A Development of JPEG-LS Platform for Micro Display Environment in AR/VR Devices (AR/VR 마이크로 디스플레이 환경을 고려한 JPEG-LS 플랫폼 개발)

  • Park, Hyun-Moon;Jang, Young-Jong;Kim, Byung-Soo;Hwang, Tae-Ho
    • The Journal of the Korea institute of electronic communication sciences / v.14 no.2 / pp.417-424 / 2019
  • This paper presents the design of a JPEG-LS codec for lossless image compression in AR/VR devices. The proposed JPEG-LS (lossless) codec is mainly composed of a context modeling block, a context update block, a pixel prediction block, a prediction error coding block, a data packetizer block, and a memory block. All operations are organized in a fully pipelined architecture for real-time image processing, and the LOCO-I compression algorithm uses an improved 2-D approach to be compliant with the SBT coding. Compared with a similar JPEG-LS study, the Block-RAM size of the proposed STB-FLC architecture is reduced to one third, and the parallel design of the prediction block improves the processing speed.
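The pixel-prediction stage of a JPEG-LS codec is built around the LOCO-I median edge detector (MED); the sketch below shows only that standard predictor, not the proposed pipeline or its memory organisation.

```python
# Standard LOCO-I / JPEG-LS median edge detector (MED): each pixel is
# predicted from its left (a), upper (b), and upper-left (c) neighbours,
# and only the prediction error is context-modelled and entropy-coded.

def med_predict(a, b, c):
    if c >= max(a, b):
        return min(a, b)      # edge above or to the left: pick the smaller
    if c <= min(a, b):
        return max(a, b)      # opposite edge orientation: pick the larger
    return a + b - c          # smooth region: planar prediction

print(med_predict(100, 104, 98))   # -> 104; the residual (actual - 104) is coded
```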

Cone-beam computed tomography texture analysis can help differentiate odontogenic and non-odontogenic maxillary sinusitis

  • Andre Luiz Ferreira Costa;Karolina Aparecida Castilho Fardim;Isabela Teixeira Ribeiro;Maria Aparecida Neves Jardini;Paulo Henrique Braz-Silva;Kaan Orhan;Sergio Lucio Pereira de Castro Lopes
    • Imaging Science in Dentistry / v.53 no.1 / pp.43-51 / 2023
  • Purpose: This study aimed to assess texture analysis (TA) of cone-beam computed tomography (CBCT) images as a quantitative tool for the differential diagnosis of odontogenic and non-odontogenic maxillary sinusitis (OS and NOS, respectively). Materials and Methods: CBCT images of 40 patients diagnosed with OS (N=20) and NOS (N=20) were evaluated. Gray-level co-occurrence matrix (GLCM) parameters and gray-level run-length matrix (GLRLM) parameters were extracted using manually placed regions of interest on lesion images. Seven texture parameters were calculated using the GLCM and 4 using the GLRLM. The Mann-Whitney test was used for comparisons between the groups, and the Levene test was performed to confirm the homogeneity of variance (α=5%). Results: The results showed statistically significant differences (P<0.05) between the OS and NOS patients regarding 3 TA parameters. NOS patients presented higher values for contrast, while OS patients presented higher values for correlation and inverse difference moment. Greater textural homogeneity was observed in the OS patients than in the NOS patients, with statistically significant differences in standard deviations between the groups for correlation, sum of squares, sum of entropy, and entropy. Conclusion: TA enabled quantitative differentiation between OS and NOS on CBCT images by using the parameters of contrast, correlation, and inverse difference moment.
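Of the two descriptor families used here, the run-length one matches this page's keyword. The sketch below builds a toy gray-level run-length matrix (GLRLM) for horizontal runs only; real texture-analysis software adds more directions, normalisation, and the derived features reported in the paper.

```python
import numpy as np

# Toy gray-level run-length matrix (GLRLM) for horizontal runs: entry (g, r-1)
# counts the runs of gray level g with length r. Other directions and the
# GLCM descriptors are omitted in this sketch.

def glrlm_horizontal(img, levels):
    w = img.shape[1]
    m = np.zeros((levels, w), dtype=int)
    for row in img:
        run_val, run_len = row[0], 1
        for v in row[1:]:
            if v == run_val:
                run_len += 1
            else:
                m[run_val, run_len - 1] += 1
                run_val, run_len = v, 1
        m[run_val, run_len - 1] += 1       # close the last run of the row
    return m

img = np.array([[0, 0, 1, 1, 1],
                [2, 2, 2, 2, 0]])
print(glrlm_horizontal(img, levels=3))
# level 0: one run of length 1 and one of length 2; level 1: one run of 3; level 2: one run of 4
```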