• Title/Summary/Keyword: run-time

1,927 search results (processing time: 0.033 seconds)

Optimization of the M/M/1 Queue with Impatient Customers

  • Lee, Eui-Yong;Lim, Kyung-Eun
    • International Journal of Reliability and Applications
    • /
    • v.3 no.4
    • /
    • pp.165-171
    • /
    • 2002
  • An optimization of the M/M/1 queue with impatient customers is studied. An impatient customer does not enter the system if his or her virtual waiting time exceeds the threshold K > 0. After assigning three costs to the system (a cost proportional to the virtual waiting time, a penalty for each impatient customer, and a penalty for each unit of the server's idle period), we show that there exists a threshold K that minimizes the long-run average cost per unit time.

  • PDF
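The threshold policy above can be explored numerically. The sketch below simulates an M/M/1 queue in which an arriving customer balks when the virtual waiting time (the remaining workload in the system) exceeds K, and accumulates the three costs named in the abstract. The arrival rate, service rate, and cost coefficients are illustrative assumptions, not values from the paper.

```python
import random

def average_cost(K, lam=1.0, mu=1.5, c_wait=1.0, c_balk=5.0, c_idle=2.0,
                 n_arrivals=100_000, seed=0):
    """Estimate the long-run average cost per unit time of an M/M/1 queue
    in which a customer balks when the virtual waiting time exceeds K."""
    rng = random.Random(seed)
    t = 0.0            # simulated time
    workload = 0.0     # virtual waiting time (remaining work in system)
    idle_time = 0.0    # total server idle time
    wait_cost = 0.0    # time integral of the workload
    balk_penalty = 0.0
    for _ in range(n_arrivals):
        gap = rng.expovariate(lam)               # time until next arrival
        busy = min(workload, gap)                # workload drains at rate 1
        wait_cost += workload * busy - busy * busy / 2.0   # ∫ workload dt
        idle_time += gap - busy
        workload = max(workload - gap, 0.0)
        t += gap
        if workload <= K:                        # impatient customer joins
            workload += rng.expovariate(mu)      # exponential service demand
        else:
            balk_penalty += c_balk
    total = c_wait * wait_cost + balk_penalty + c_idle * idle_time
    return total / t
```

Scanning `average_cost` over a grid of K values exposes the cost-minimizing threshold that the paper proves to exist.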

Evaluation of Runoff Loads and Computing of Contribute ratio by First Flush Stormwater from Cheongyang-Hongseong Road (청양-홍성간 도로에서의 초기강우에 의한 유출부하량 평가 및 기여율 산정)

  • Lee, Chun-Won;Kang, Seon-Hong;Choi, I-Song;An, Tae-Ung
    • Journal of Korean Society of Water and Wastewater
    • /
    • v.25 no.3
    • /
    • pp.407-417
    • /
    • 2011
  • Nowadays, intensive land use driven mainly by urbanization is increasing the runoff loads of non-point pollutants. These increasing runoff loads appear to account for a high share of the total pollution load, making non-point sources a main cause of deteriorating water quality. In particular, pollutants that accumulate on impermeable roads run off into public water bodies; the runoff coefficient of roads is high and the runoff velocity fast, so large pollutant loads are discharged during rainfall. Evaluating these pollutant loads is therefore an important task. In this study, pollutant loadings were computed as functions of time and rainfall to analyze the runoff characteristics of first flush stormwater, and the runoff time and rainfall during the first flush were evaluated from the changes in the measured curves. The contribution ratio was also computed to identify the impact on stream water quality. The management and treatment of first flush stormwater effluents proved very important for managing road non-point source pollutants, because the runoff loads of non-point source pollution exceeded 80% of the whole stream load. For SS, the first flush runoff time was under 30 minutes and the rainfall under 5 mm, less than 20% of the total rainfall. This is within the 5 mm regarded as the first flush amount by the Ministry of Environment, presumably because rainfall runoff is very fast on impermeable roads. The runoff time and rainfall for BOD were higher than for SS, so non-point sources should be managed differently for each material.
    Finally, the contribution ratios of pollutant loads by rainfall runoff were SS 12.7%, BOD 12.7%, COD 15.9%, T-N 4.9%, and T-P 8.9%, while the pollutant load flowing into the stream was 4.4%. This indicates that the concentration of non-point pollutants is relatively high, and that systematic management of non-point sources is needed to improve stream water quality.
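As a brief illustration of the load arithmetic behind such contribution ratios: a runoff load is the time integral of concentration times flow, and the contribution ratio is the event load as a share of the total stream load. The sample values below are hypothetical, not data from the study.

```python
def runoff_load_kg(samples):
    """Total pollutant load (kg) from (interval_s, flow_m3_s, conc_mg_L)
    samples: load = sum of concentration x flow x interval; 1 mg/L x 1 m3 = 1 g."""
    return sum(dt * q * c for dt, q, c in samples) / 1000.0  # g -> kg

def contribution_ratio(event_load_kg, total_stream_load_kg):
    """Share (%) of the stream's total load attributable to the runoff event."""
    return 100.0 * event_load_kg / total_stream_load_kg

# hypothetical 10-minute samples of SS during a first-flush event
event = [(600, 0.05, 120.0), (600, 0.08, 95.0), (600, 0.04, 40.0)]
load = runoff_load_kg(event)  # about 9.12 kg of SS over 30 minutes
```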

Mock Galaxy Catalogs from the Horizon Run 4 Simulation with the Most Bound Halo Particle - Galaxy Correspondence Method

  • Hong, Sungwook E.;Park, Changbom;Kim, Juhan
    • The Bulletin of The Korean Astronomical Society
    • /
    • v.40 no.2
    • /
    • pp.29.3-30
    • /
    • 2015
  • We introduce an advanced one-to-one galaxy correspondence method that populates dark matter halos with galaxies by tracing the merging histories of the most bound member particles (MBPs) identified in simulated virialized halos. To estimate the survival time of a satellite galaxy, we adopt several models of tidal-destruction time derived from an analytic calculation, isolated galaxy simulations, and cosmological simulations. We build mock galaxy samples for each model by using the merging-tree information of MBPs from our new Horizon Run 4 N-body simulation from z = 12 to 0. For the models of galaxy survival time derived from cosmological and isolated galaxy simulations, about 40% of the satellite galaxies that merged into a halo survive until z = 0. We compare mock galaxy samples from our MBP-galaxy correspondence scheme and the subhalo-galaxy scheme with SDSS volume-limited galaxy samples around z = 0 with $M_r-5{\log}h$ < -21 and -20. Compared to the subhalo-galaxy correspondence method, our method predicts more satellite galaxies close to their host halo center and a larger pairwise peculiar velocity of galaxies. As a result, our method reproduces the observed galaxy group mass function, the number of member galaxies, and the two-point correlation functions, while the subhalo-galaxy correspondence method underestimates them.

  • PDF

Analysis of Perchlorate in Water Using Ion Chromatograph with Preconcentration (이온크로마토그래프를 이용한 수중의 퍼클로레이트 농축 및 분석)

  • Kim, Hak-Chul
    • Journal of environmental and Sanitary engineering
    • /
    • v.21 no.4 s.62
    • /
    • pp.29-38
    • /
    • 2006
  • This study developed an analytical method for determining perchlorate in water samples. The analytical conditions were based on EPA Method 314.0, which uses ion chromatography, with the concentrator column replaced by a guard column. Concentrating 10 mL of raw or treated water sample onto an AG16 guard column made it possible to reach an LOD (limit of detection) of $0.73\;{\mu}g/L$. The total run time was 11 minutes, and during each run the next sample could be concentrated on the AG16 guard column. Compared with the concentration method, which required manual operation, the direct-injection method could screen many water samples; however, its LOD was higher and its sensitivity lower. With the pre-concentration method, the RSDs (relative standard deviations) were below 2.5% for peak height and 0.7% for retention time. The method showed good reproducibility and reliability, and the deviations in recovery could likely be reduced by considering column capacity and homogenizing the water samples. The pre-concentration method also allowed matrix elimination when perchlorate was present in a complex sample matrix.
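The precision figures above rest on a simple statistic: the RSD is the sample standard deviation divided by the mean, expressed in percent. A minimal sketch with hypothetical replicate values (the paper's own measurements are not reproduced here):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%): sample std dev / mean x 100."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# hypothetical replicate peak heights and retention times
# from repeated pre-concentration runs of the same sample
peak_heights = [101.2, 99.8, 100.5, 101.9, 99.1]
retention_times = [6.52, 6.53, 6.52, 6.51, 6.52]
```

For these made-up replicates, `rsd_percent(peak_heights)` falls within the sub-2.5% range the study reports for peak height.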

A Study on the Measurement of Back Muscle Fatigue During Dynamic Contraction Using Multiple Parameters (다중 파라메터를 이용한 동적 수축시 허리 근육 피로 측정에 관한 연구)

  • Yoon, Jung-Gun;Jung, Chul-Ki;Yeo, Song-Phil;Kim, Sung-Hwan
    • The Transactions of the Korean Institute of Electrical Engineers D
    • /
    • v.55 no.7
    • /
    • pp.344-351
    • /
    • 2006
  • The fatigue of the back muscles during repetitive lifting motion was studied using multiple parameters (FFT_MDF, RMS, ZC, NT). Recent time-frequency analysis procedures for computing the IMDF (instantaneous median frequency) with the Cohen-Posch distribution were utilized to overcome the nonstationarity of the EMG signal, but that method requires much computation time because of its complexity. Therefore, in this study, the FFT_MDF (median frequency estimation based on FFT) algorithm was used to estimate the median frequency of the back-muscle EMG signal during the uniform-velocity portion of the lumbar movement. The analysis period of the EMG signal was determined using the run test and the lumbar movement angle during dynamic tasks such as lifting. Results showed that the FFT_MDF algorithm is well suited to estimating back-muscle fatigue in terms of computation time. The negative slope of a regression line fitted to the median frequency values of the back-muscle EMG signal was taken as an indication of muscle fatigue. The fatigue slope obtained with the FFT_MDF method agreed to 77.8% with the CP_MDF (median frequency estimation based on the Cohen-Posch distribution) method.
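A minimal FFT-based median-frequency estimator in the spirit of FFT_MDF can be sketched as follows; the windowing, run-test segmentation, and Cohen-Posch comparison of the paper are omitted, and the function names are our own.

```python
import numpy as np

def median_frequency(signal, fs):
    """Median frequency of a signal: the frequency that splits the FFT
    power spectrum into two halves of equal total power."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    cumulative = np.cumsum(spectrum)
    # first frequency bin at which half of the total power is reached
    return freqs[np.searchsorted(cumulative, cumulative[-1] / 2.0)]

def fatigue_slope(median_freqs, times):
    """Slope of a least-squares line through successive MDF values;
    a negative slope indicates progressive muscle fatigue."""
    return np.polyfit(times, median_freqs, 1)[0]
```

For a pure 50 Hz sinusoid sampled at 1 kHz, `median_frequency` returns 50 Hz, and a declining MDF series gives a negative `fatigue_slope`.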

Efficient Hardware Implementation of Real-time Rectification using Adaptively Compressed LUT

  • Kim, Jong-hak;Kim, Jae-gon;Oh, Jung-kyun;Kang, Seong-muk;Cho, Jun-Dong
    • JSTS:Journal of Semiconductor Technology and Science
    • /
    • v.16 no.1
    • /
    • pp.44-57
    • /
    • 2016
  • Rectification is used as a preprocessing step to reduce the computational complexity of disparity estimation, but rectification itself also requires complex computation. To minimize this complexity, rectification using a lookup table (R-LUT) has been introduced; however, since the R-LUT consumes a large amount of memory, rectification with a compressed LUT (R-CLUT) has been introduced in turn. The more we reduce memory consumption, the more decoding overhead we incur, so an acceptable trade-off between LUT size and decoding overhead is needed. In this paper, we present such a trade-off by adaptively combining simple coding methods: differential coding, modified run-length coding (MRLE), and Huffman coding. Differential coding transforms the coordinate data into differential form to improve coding efficiency, complemented by Huffman coding for better stability and MRLE for better performance. Our experimental results verify that the scheme yields high performance while maintaining robustness. Our method showed about 1% to 16% lower average inverse compression ratio than existing methods, and maintained low latency with tolerable hardware overhead for real-time implementation.
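The coding stages can be illustrated with a simplified sketch: plain differential coding followed by plain run-length coding. The paper's modified RLE, Huffman stage, and adaptive method selection are not reproduced here, and the function names are our own.

```python
def delta_encode(coords):
    """Differential coding: store the first value plus successive
    differences, which clusters values near zero and produces the runs
    that a run-length or Huffman stage can exploit."""
    return [coords[0]] + [b - a for a, b in zip(coords, coords[1:])]

def delta_decode(deltas):
    """Invert delta_encode by cumulative summation."""
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

def rle_encode(values):
    """Plain run-length coding: collapse each run of equal values
    into a (value, run_length) pair."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return [(v, n) for v, n in runs]
```

On smoothly varying LUT coordinates, the delta stream is dominated by repeated small values, which is exactly what makes the subsequent run-length pass effective.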

Load Balancing Based on Transform Unit Partition Information for High Efficiency Video Coding Deblocking Filter

  • Ryu, Hochan;Park, Seanae;Ryu, Eun-Kyung;Sim, Donggyu
    • ETRI Journal
    • /
    • v.39 no.3
    • /
    • pp.301-309
    • /
    • 2017
  • In this paper, we propose a parallelization method for the High Efficiency Video Coding (HEVC) deblocking filter using transform unit (TU) split information. HEVC employs a deblocking filter to improve perceptual quality and coding efficiency, and the filter was designed for data-level parallelism. We demonstrate a method of distributing equal workloads to all cores or threads by anticipating the deblocking-filter complexity from the coding unit depth and the TU split information. The proposed parallelization achieves a speed-up factor 2% better than that of a uniformly distributed parallel deblocking filter, and 6% better than that of coding-tree-unit row-distribution parallelism. In addition, in terms of run-time, the proposed method achieves a speed-up factor of up to 3.1 compared with the sequential deblocking filter of HEVC test model 12.0.
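The workload-distribution idea can be sketched as a standard longest-processing-time greedy partition: given a per-block complexity estimate (in the paper, derived from coding unit depth and TU split information), each block goes to the currently least-loaded thread. The cost values and function names below are illustrative, not the paper's exact scheme.

```python
import heapq

def balance_workloads(costs, n_threads):
    """Greedy longest-processing-time partition: assign blocks in order of
    descending estimated deblocking cost to the least-loaded thread.
    Returns, per thread, the list of block indices assigned to it."""
    heap = [(0.0, t) for t in range(n_threads)]   # (current load, thread id)
    heapq.heapify(heap)
    assignment = [[] for _ in range(n_threads)]
    for block, cost in sorted(enumerate(costs), key=lambda x: -x[1]):
        load, t = heapq.heappop(heap)             # least-loaded thread
        assignment[t].append(block)
        heapq.heappush(heap, (load + cost, t))
    return assignment
```

With estimated costs `[5, 1, 1, 1, 1, 1]` and two threads, the heavy block lands alone on one thread while the five light blocks share the other, giving both threads equal load.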

Optimization of photo-catalytic degradation of oil refinery wastewater using Box-Behnken design

  • Tetteh, Emmanuel Kweinor;Naidoo, Dushen Bisetty;Rathilal, Sudesh
    • Environmental Engineering Research
    • /
    • v.24 no.4
    • /
    • pp.711-717
    • /
    • 2019
  • The photocatalytic treatment of oil refinery wastewater with titanium dioxide nanoparticles under UV radiation was investigated. Synthetic wastewater prepared from phenol crystals, Power Glide SAE40 motor oil, and water was used. Response surface methodology (RSM) based on the Box-Behnken design was employed to design the experimental runs and to optimize and study the interaction effects of the operating parameters (catalyst concentration, run time, and airflow rate) so as to maximize the degradation of oil (SOG) and phenol. Analysis of variance and the fitted response models were used to evaluate the data at a 95% confidence level. The RSM revealed the graphical relationships between individual factors and their interactive effects on the response, in contrast to the one-factor-at-a-time approach. The optimum photocatalytic degradation conditions were a catalyst concentration of 2 g/L, a run time of 30 min, and an airflow rate of 1.04 L/min. Under these conditions, a desirability of 68% was obtained, corresponding to 81% SOG and 66% phenol degradation. Thus, the hydrocarbon oils were readily degradable, whereas the phenols were more resistant to photocatalytic degradation.
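For reference, a coded-level Box-Behnken design for three factors consists of 12 edge-midpoint runs plus replicated center points. A minimal generator is sketched below; the factor ranges in `decode` are placeholders, not the study's exact settings.

```python
from itertools import combinations, product

def box_behnken(n_factors, n_center=3):
    """Coded-level (-1, 0, +1) Box-Behnken design: for every pair of
    factors, take all four +/-1 combinations with the remaining factors
    held at 0, then append n_center center-point runs."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0] * n_factors
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0] * n_factors for _ in range(n_center)])
    return runs

def decode(row, lows, highs):
    """Map coded levels to physical values (e.g. catalyst g/L, run time
    min, airflow L/min); ranges here are hypothetical."""
    return [lo + (c + 1) * (hi - lo) / 2 for c, lo, hi in zip(row, lows, highs)]
```

For three factors this yields the 15-run design (12 edge runs, 3 center points) typical of studies like this one.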

A Multimedia Authoring System Supporting Dynamic Presentations (동적 프리젠테이션을 지원하는 멀티미디어 저작 시스템)

  • Choi, Sook-Young;Shin, Hyun-San;Yoo, Kwan-Jong
    • The Transactions of the Korea Information Processing Society
    • /
    • v.7 no.2
    • /
    • pp.328-336
    • /
    • 2000
  • This paper presents a multimedia authoring system in which users can create multimedia documents more easily and dynamic presentations are supported. The system defines a new time relation based on causal relations, so it can control presentations effectively when the durations of media objects change at run-time. It supports dynamic authoring by giving feedback on consistency problems that can occur while users author multimedia documents. In the system, a multimedia document is parsed into an internal tree structure, and a presentation engine processes dynamic presentations and user interactions at run-time.

  • PDF

Storage systems using RLE compression (RLE 압축 기법을 이용한 저장 시스템)

  • Kim, Kyeong-Og;Kim, Jong-Chan;Ban, Kyeong-Jin;Heo, Su-Yeon;Kim, Eung-Kon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2010.05a
    • /
    • pp.686-688
    • /
    • 2010
  • The supply of context information is increasing with the spread of ubiquitous computing environments. As context information is collected through electronic tags and sensors attached to the environment, methods are needed to efficiently store and search large volumes of data. This paper describes the application of the RLE (run-length encoding) compression method to sensors that continuously collect data in USN/RFID terminals. Time information is marked on the data, and one data block is generated and saved. The paper proposes a storage method that allows fast searching of the data for a desired time and place by recording time information along with the continuous data.

  • PDF
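The storage idea can be sketched as run-length encoding over timestamped readings: identical consecutive sensor values collapse into one block that keeps its start and end times, so the value at any time can still be located. A minimal sketch (names are our own, not from the paper):

```python
def rle_compress(readings):
    """Collapse runs of identical sensor values in (time, value) readings
    into (start_time, end_time, value) blocks, preserving the time
    information needed for later lookup."""
    blocks = []
    for t, v in readings:
        if blocks and blocks[-1][2] == v:
            blocks[-1][1] = t                 # extend the current run
        else:
            blocks.append([t, t, v])          # start a new run
    return [tuple(b) for b in blocks]

def lookup(blocks, t):
    """Return the value recorded at time t, or None if t is not covered."""
    for start, end, v in blocks:
        if start <= t <= end:
            return v
    return None
```

A slowly changing sensor stream (e.g. a temperature read once per second) compresses to a handful of blocks, and `lookup` answers time-based queries directly on the compressed form.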