• Title/Summary/Keyword: Monte Carlo techniques


Kennicutt-Schmidt law with H I velocity profile decomposition in NGC 6822

  • Park, Hye-Jin;Oh, Se-Heon;Wang, Jing;Zheng, Yun;Zhang, Hong-Xin;de Blok, W.J.G.
    • The Bulletin of The Korean Astronomical Society
    • /
    • v.46 no.1
    • /
    • pp.32.3-33
    • /
    • 2021
  • We present H I gas kinematics and star formation activities of NGC 6822, a dwarf galaxy located in the Local Volume at a distance of ~490 kpc. We perform profile decomposition of the line-of-sight velocity profiles of the high-resolution (42.4" × 12" spatial; 1.6 km/s spectral) H I data cube taken with the Australia Telescope Compact Array (ATCA). For this, we use a new tool, BAYGAUD (BAYesian GAUssian Decompositor), which is based on Bayesian Markov Chain Monte Carlo (MCMC) techniques and allows us to decompose a line-of-sight velocity profile into an optimal number of Gaussian components in a quantitative manner. We classify the decomposed H I gas components of NGC 6822 into bulk-narrow, bulk-broad, and non-bulk with respect to their velocity and velocity dispersion. We correlate their gas surface densities with the star formation rate surface densities derived using both GALEX far-ultraviolet and WISE 22 micron data to examine the impact of gas turbulence caused by stellar feedback on the Kennicutt-Schmidt (K-S) law. The bulk-narrow component that resides within r25 is likely to follow the linear extension of the K-S law for molecular hydrogen (H2) in the low gas surface density regime where H I is not saturated.
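
The core idea behind an MCMC-based profile decomposer can be illustrated with a minimal Metropolis sampler fitting one Gaussian component to a synthetic line-of-sight profile. This is a sketch only: the velocity grid, component parameters, noise level, and proposal scales below are invented for illustration, not taken from the NGC 6822 data or from BAYGAUD itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic line-of-sight H I profile: one Gaussian component plus noise.
# All numbers are illustrative, not taken from the paper.
v = np.linspace(-50.0, 50.0, 101)          # velocity axis, km/s
amp_t, mu_t, sig_t, noise = 10.0, 5.0, 8.0, 1.0
y = amp_t * np.exp(-(v - mu_t) ** 2 / (2 * sig_t ** 2)) \
    + rng.normal(0.0, noise, v.size)

def log_like(p):
    """Gaussian log-likelihood for parameters (amplitude, centre, dispersion)."""
    amp, mu, sig = p
    if amp <= 0 or sig <= 0:
        return -np.inf
    model = amp * np.exp(-(v - mu) ** 2 / (2 * sig ** 2))
    return -0.5 * np.sum((y - model) ** 2) / noise ** 2

# Plain Metropolis sampler over (amplitude, centre, dispersion).
p = np.array([8.0, 0.0, 5.0])              # deliberately offset start
lp = log_like(p)
chain = []
for _ in range(20000):
    q = p + rng.normal(0.0, [0.3, 0.1, 0.15])
    lq = log_like(q)
    if np.log(rng.uniform()) < lq - lp:
        p, lp = q, lq
    chain.append(p.copy())
chain = np.array(chain[5000:])             # discard burn-in

amp_e, mu_e, sig_e = chain.mean(axis=0)
print(f"centre ~ {mu_e:.2f} km/s, dispersion ~ {sig_e:.2f} km/s")
```

A full decomposer additionally runs this for an increasing number of components and picks the count with the best Bayesian evidence; the single-component sampler above is only the innermost building block.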


Reliability-based approach for fragility assessment of bridges under floods

  • Raj Kamal Arora;Swagata Banerjee
    • Structural Engineering and Mechanics
    • /
    • v.88 no.4
    • /
    • pp.311-322
    • /
    • 2023
  • Riverine flooding is one of the critical natural threats to river-crossing bridges. As floods are among the most frequently occurring natural hazards worldwide, the survival probability of bridges under floods must be assessed in a speedy but precise manner. In this regard, the paper presents a reliability-based approach for a rapid assessment of the failure probability of vulnerable bridge components under floods. This robust method is generic in nature and can be applied to both concrete and steel girder bridges. The developed methodology essentially utilizes limit state performance functions, expressed in terms of capacity and flood demand, for probable failure modes of various vulnerable bridge components. The Advanced First Order Reliability Method (AFORM), Monte Carlo Simulation (MCS), and Latin Hypercube Simulation (LHS) techniques are applied for reliability assessment and for developing flood fragility curves of bridges, in which flow velocity and water height are taken as flood intensity measures. Upon validation, the proposed method is applied to a case study bridge subjected to the flood scenario of a river in Gujarat, India. The research outcome portrays how effectively and efficiently the proposed reliability-based method can be applied for a quick assessment of the flood vulnerability of bridges in any flood-prone region of interest.
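
The Monte Carlo part of such a reliability assessment reduces to sampling a limit state function g = capacity − demand and counting failures. The sketch below uses a generic g = R − S with invented normal distributions (so the exact answer is known for comparison); it is not the paper's bridge-specific limit state.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical limit state g = R - S: component capacity R minus flood
# demand S. Both normal here purely for illustration, so the exact
# failure probability is available in closed form.
mu_R, sd_R = 5.0, 0.8      # capacity moments (arbitrary units)
mu_S, sd_S = 3.0, 0.6      # flood demand moments (arbitrary units)

R = rng.normal(mu_R, sd_R, n)
S = rng.normal(mu_S, sd_S, n)
g = R - S

pf_mc = np.mean(g < 0)                     # Monte Carlo failure probability
beta = -norm.ppf(pf_mc)                    # corresponding reliability index

pf_exact = norm.cdf(-(mu_R - mu_S) / np.hypot(sd_R, sd_S))
print(f"Pf(MC) = {pf_mc:.5f}, Pf(exact) = {pf_exact:.5f}, beta = {beta:.2f}")
```

Repeating this over a grid of flow velocities and water heights, and plotting Pf against intensity, is what yields a fragility curve.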

Feasibility study of spent fuel internal tomography (SFIT) for partial defect detection within PWR spent nuclear fuel

  • Hyung-Joo Choi;Hyojun Park;Bo-Wi Cheon;Hyun Joon Choi;Hakjae Lee;Yong Hyun Chung;Chul Hee Min
    • Nuclear Engineering and Technology
    • /
    • v.56 no.6
    • /
    • pp.2412-2420
    • /
    • 2024
  • The International Atomic Energy Agency (IAEA) mandates safeguards to ensure non-proliferation of nuclear materials. Among the inspection techniques used to detect partial defects within spent nuclear fuel (SNF), gamma emission tomography (GET) has been reported to be reliable for detecting partial defects on a pin-by-pin level. Conventional GET, however, is limited by low detection efficiency due to the high density of nuclear fuel rods and self-absorption. This paper proposes a new type of GET named Spent Fuel Internal Tomography (SFIT), which can acquire sinograms at the guide tube. The proposed device consists of the housing, shielding, a C-shaped collimator, a reflector, and a gadolinium aluminum gallium garnet (GAGG) scintillator. For accurate attenuation correction, the source-distinguishable range of the SFIT device was determined via MC simulation to be the region extending from the proposed device to the second layer. For enhanced inspection accuracy, a purpose-built source-discrimination algorithm was applied. With this, the SFIT device successfully distinguished all source locations. A comparison of images from the existing and proposed inspection methods showed that the proposed method, having successfully distinguished all sources, afforded a 150% improvement in inspection accuracy.

The Design of Optimal Filters in Vector-Quantized Subband Codecs (벡터양자화된 부대역 코덱에서 최적필터의 구현)

  • 지인호
    • The Journal of the Acoustical Society of Korea
    • /
    • v.19 no.1
    • /
    • pp.97-102
    • /
    • 2000
  • Subband coding divides the signal frequency band into a set of uncorrelated frequency bands by filtering and then encodes each of these subbands using a bit allocation rationale matched to the signal energy in that subband. The actual coding of the subband signals can be done using waveform encoding techniques such as PCM, DPCM, and vector quantization (VQ) in order to obtain higher data compression. Most researchers have focused on the error in the quantizer, but not on the overall reconstruction error and its dependence on the filter bank. This paper provides a thorough analysis of subband codecs and further development of optimum filter bank design using vector quantization. We compute the mean squared reconstruction error (MSE), which depends on N, the number of entries in each codebook, on k, the length of each codeword, and on the filter bank coefficients. We form this MSE measure in terms of the equivalent quantization model and find the optimum FIR filter coefficients for each channel in the M-band structure for a given bit rate, filter length, and input signal correlation model. Specific design examples are worked out for 4-tap filters in a 2-band paraunitary filter bank structure. These optimum paraunitary filter coefficients are obtained using Monte Carlo simulation. We expect that the results of this work will contribute to the study of the optimum design of subband codecs using vector quantization.
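
The reconstruction-MSE measurement at the heart of this analysis can be sketched by Monte Carlo: pass a random signal through a 2-band paraunitary bank, quantize the subbands, synthesize, and average the squared error. As a simplification, the sketch uses the Haar butterfly (the simplest paraunitary pair) and uniform scalar quantization with step Δ in place of VQ, so the expected MSE is the textbook Δ²/12; the paper's optimal 4-tap filters and codebooks are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simplified stand-in for the paper's setup: a 2-band paraunitary (Haar)
# filter bank with uniform scalar quantization instead of VQ.
x = rng.normal(0.0, 1.0, 2 ** 16)
delta = 0.1                                 # quantizer step size

# Analysis: Haar butterfly (paraunitary, so it preserves energy).
e, o = x[0::2], x[1::2]
low  = (e + o) / np.sqrt(2)
high = (e - o) / np.sqrt(2)

# Quantize each subband, then synthesize with the inverse butterfly.
ql = delta * np.round(low / delta)
qh = delta * np.round(high / delta)
xr = np.empty_like(x)
xr[0::2] = (ql + qh) / np.sqrt(2)
xr[1::2] = (ql - qh) / np.sqrt(2)

mse = np.mean((x - xr) ** 2)
print(f"MC reconstruction MSE = {mse:.2e}, delta^2/12 = {delta**2/12:.2e}")
```

Because a paraunitary bank preserves energy, the subband quantization noise passes through synthesis unamplified; for non-paraunitary filters the measured MSE would instead depend on the filter coefficients, which is exactly the dependence the paper optimizes.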


Material Decomposition through Weighted Image Subtraction in Dual-energy Spectral Mammography with an Energy-resolved Photon-counting Detector using Monte Carlo Simulation (몬테카를로 시뮬레이션을 이용한 광자계수검출기 기반 이중에너지 스펙트럼 유방촬영에서 가중 영상 감산법을 통한 물질분리)

  • Eom, Jisoo;Kang, Sooncheol;Lee, Seungwan
    • Journal of radiological science and technology
    • /
    • v.40 no.3
    • /
    • pp.443-451
    • /
    • 2017
  • Mammography is commonly used for screening for early breast cancer. However, mammographic images, which depend on the physical properties of breast components, are limited in providing information about whether a lesion is malignant or benign. Although a dual-energy subtraction technique can decompose a certain material from a mixture, it increases radiation dose and degrades the accuracy of material decomposition. In this study, we simulated a breast phantom using attenuation characteristics, and we proposed a technique to enable accurate material decomposition by applying weighting factors to dual-energy mammography based on a photon-counting detector, using a Monte Carlo simulation tool. We also evaluated the contrast and noise of the simulated breast images to validate the proposed technique. As a result, the contrast for a malignant tumor with the dual-energy weighted subtraction technique was 0.98 and 1.06 times that obtained with the general mammography and dual-energy subtraction techniques, respectively. However, the contrast between malignant and benign tumors increased dramatically, by a factor of 13.54, owing to the low contrast of the benign tumor. Therefore, the proposed technique can increase the material decomposition accuracy for malignant tumors and improve the diagnostic accuracy of mammography.
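
The weighting-factor idea can be shown with a toy Beer-Lambert model: in log space the two-energy signals are linear in material thicknesses, so a weight equal to the background material's attenuation ratio cancels that material in the subtraction. All attenuation coefficients and thicknesses below are invented for illustration, not the paper's simulated values.

```python
# Toy Beer-Lambert model of dual-energy mammography. The attenuation
# coefficients below are invented for illustration, not physical values.
mu = {  # (low-energy, high-energy) linear attenuation, 1/cm
    "gland": (0.80, 0.50),
    "tumor": (0.95, 0.58),
}

def log_signal(t_gland, t_tumor, energy):
    """-ln(I/I0) for a two-material path at the given energy index."""
    return mu["gland"][energy] * t_gland + mu["tumor"][energy] * t_tumor

# Weight chosen to cancel the glandular background in the subtraction:
w = mu["gland"][0] / mu["gland"][1]

def weighted_sub(t_gland, t_tumor):
    return log_signal(t_gland, t_tumor, 0) - w * log_signal(t_gland, t_tumor, 1)

# The weighted signal depends only on the tumor thickness, not on the
# (varying) glandular thickness it is embedded in.
a = weighted_sub(t_gland=4.0, t_tumor=0.5)
b = weighted_sub(t_gland=2.5, t_tumor=0.5)
print(f"{a:.4f} vs {b:.4f}  (glandular background cancelled)")
```

In a real photon-counting acquisition the two energy bins come from one exposure, which is why the weighted subtraction can improve decomposition accuracy without the extra dose of a two-shot dual-energy technique.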

A Study on Photon Dose Calculation in 6 MV Linear Accelerator Based on Monte Carlo Method (몬테카를로 방법에 의한 6 MV 선형가속기의 광자 흡수선량 분포 평가에 관한 연구)

  • Kang, Sang-Koo;Ahn, Sung-Hwan;Kim, Chong-Yeal
    • Journal of radiological science and technology
    • /
    • v.34 no.1
    • /
    • pp.43-50
    • /
    • 2011
  • In this study we modeled the Varian 2100C/D linear accelerator head and multi-leaf collimator by simulation with the GEANT4 Monte Carlo toolkit. Central-axis percentage depth dose profiles and lateral dose profiles within a homogeneous water phantom (50 × 50 × 50 cm³) were then evaluated for the 6 MV photon beam. The simulations were performed in two stages. In the first stage, the photon energy spectrum at the target was computed. This spectral data was then used to irradiate the water phantom via sampling techniques. The simulation data were compared with experimental data to evaluate the accuracy of the model. Results showed that the two datasets matched within a 2% error bound. The proposed method will be applied to simulations of dose calculation and dose distribution studies.
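
The second stage, drawing source photons from a stored spectrum, is classic inverse-transform sampling over the tabulated bins. The sketch below uses an invented bremsstrahlung-like shape, not the actual 6 MV spectrum computed in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stage-1 stand-in: a tabulated bremsstrahlung-like photon spectrum
# (energy bin edges in MeV, relative fluence per bin). Values are
# invented, not the paper's computed 6 MV spectrum.
edges = np.linspace(0.0, 6.0, 61)                 # 0.1 MeV bins
centers = 0.5 * (edges[:-1] + edges[1:])
weights = centers * np.exp(-centers / 1.5)        # rises, then decays
weights /= weights.sum()

# Stage 2: inverse-transform sampling -- draw source photon energies
# from the stored spectrum to irradiate the phantom.
cdf = np.cumsum(weights)
u = rng.uniform(size=1_000_000)
idx = np.searchsorted(cdf, u)                     # pick a bin per photon
energies = rng.uniform(edges[idx], edges[idx + 1])  # uniform within bin

print(f"sampled mean = {energies.mean():.3f} MeV, "
      f"spectrum mean = {(weights * centers).sum():.3f} MeV")
```

Splitting the simulation this way means the expensive head model runs once; millions of phantom histories can then be generated cheaply from the stored spectrum.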

LLR-based Cooperative ARQ Protocol in Rayleigh Fading Channel (레일리 페이딩 채널에서 LLR 기반의 협력 ARQ 프로토콜)

  • Choi, Dae-Kyu;Kong, Hyung-Yun
    • Journal of the Institute of Electronics Engineers of Korea TC
    • /
    • v.45 no.4
    • /
    • pp.31-37
    • /
    • 2008
  • Conventional cooperative communications can attain spatial diversity gain and path loss reduction because the destination node independently receives the same signal from the source node and from a relay node located between the source and destination. However, these techniques decrease spectral efficiency because of the relay node and increase receiver complexity through the use of maximal ratio combining (MRC). This paper proposes a cooperative ARQ protocol that mitigates these problems and achieves better performance. The method can achieve higher spectral efficiency than conventional cooperative communication because, if the signal received from the source node already satisfies the destination, the destination transmits an ACK message to both the relay and source nodes and then recovers the received signal. In addition, if the ARQ message indicates NACK, the relay node performs selective retransmission, which increases system reliability compared with the general ARQ protocol in which the source node retransmits the data. In the proposed protocol, selective retransmission and the ARQ message are determined by comparing the log-likelihood ratio (LLR) computed for the signal received from the source node with predetermined threshold values. Therefore, the protocol does not waste redundant bandwidth on CRC codes and reduces receiver complexity by avoiding MRC. We verified the spectral efficiency and BER performance of the proposed protocol through Monte Carlo simulation over Rayleigh fading plus AWGN.
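
For BPSK over a flat Rayleigh channel with known gain h and noise variance σ², the per-bit LLR is 2hy/σ², and its magnitude is exactly the reliability measure a threshold test like the one above would use. The Monte Carlo sketch below checks the resulting BER against the closed-form Rayleigh BPSK expression; the SNR and sample count are illustrative, not the paper's simulation settings.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# BPSK over flat Rayleigh fading with AWGN; channel gain h is known at
# the receiver. Parameters are illustrative, not from the paper.
sigma2 = 0.25                                # noise variance (N0/2)
bits = rng.integers(0, 2, n)
s = 1.0 - 2.0 * bits                         # 0 -> +1, 1 -> -1
h = np.sqrt(rng.exponential(1.0, n))         # Rayleigh envelope, E[h^2] = 1
y = h * s + rng.normal(0.0, np.sqrt(sigma2), n)

# Per-bit log-likelihood ratio; |llr| measures reliability and can be
# compared with a threshold to decide ACK vs. NACK / retransmission.
llr = 2.0 * h * y / sigma2
ber = np.mean((llr < 0) != (s < 0))          # hard decision = sign of LLR

gbar = 1.0 / (2.0 * sigma2)                  # average SNR per bit
ber_theory = 0.5 * (1.0 - np.sqrt(gbar / (1.0 + gbar)))
print(f"BER(MC) = {ber:.4f}, BER(theory) = {ber_theory:.4f}")
```

In the protocol, bits whose |LLR| falls below the threshold are the ones that trigger a NACK and a selective retransmission from the relay.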

Reliability Assessment Based on an Improved Response Surface Method (개선된 응답면기법에 의한 신뢰성 평가)

  • Cho, Tae Jun;Kim, Lee Hyeon;Cho, Hyo Nam
    • Journal of Korean Society of Steel Construction
    • /
    • v.20 no.1
    • /
    • pp.21-31
    • /
    • 2008
  • The response surface method (RSM) is widely used to evaluate extremely small probabilities of occurrence or to analyze the reliability of very complicated structures. Although the Monte Carlo Simulation (MCS) technique can evaluate any system, the processing time of MCS depends on the reciprocal of the probability of failure. The stochastic finite element method could overcome this limitation; however, it is restricted to specific programs in which the mean and coefficient of variation of the random variables are programmed by a perturbation or a weighted integral method, and it is therefore not applicable without prerequisite programming. With only a small number of stage analyses, RSM can construct a regression model from the response of the complicated structural system, thus saving time and effort significantly. However, the accuracy of RSM depends on the distance of the axial points and on the linearity of the limit state functions. To improve convergence to the exact solution regardless of the linearity of the limit state functions, an improved adaptive response surface method is developed. The analyzed results have been verified using linear and quadratic forms of response surface functions in two examples. As a result, the best combination of the improved RSM techniques is determined and programmed in a numerical code. The developed linear adaptive weighted response surface method (LAW-RSM) shows the closest converged reliability indices, compared with quadratic-form, non-adaptive, or non-weighted RSMs.
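
The basic (non-adaptive) RSM cycle can be sketched in a few lines: evaluate the limit state at a centre point and axial points, fit a quadratic surface by least squares, then run cheap Monte Carlo on the surrogate. The limit state, distributions, and axial-point distance below are invented for illustration; the paper's adaptive weighting scheme is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical limit state (stands in for an expensive FE analysis).
def g(x1, x2):
    return 5.0 - x1 ** 2 - x2

mu = np.array([0.0, 2.0])                    # means of the random variables
sd = np.array([0.8, 0.8])                    # standard deviations

# Design points: centre plus axial points at +/- 2 standard deviations.
pts = [mu]
for i in range(2):
    for sign in (-1.0, 1.0):
        p = mu.copy()
        p[i] += sign * 2.0 * sd[i]
        pts.append(p)
pts = np.array(pts)
resp = g(pts[:, 0], pts[:, 1])               # "expensive" evaluations

# Fit quadratic surface g^ = a0 + a1 x1 + a2 x2 + a3 x1^2 + a4 x2^2.
A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0] ** 2, pts[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(A, resp, rcond=None)

# Monte Carlo on the cheap surrogate vs. on the true limit state.
x = rng.normal(mu, sd, (500_000, 2))
g_hat = (coef[0] + coef[1] * x[:, 0] + coef[2] * x[:, 1]
         + coef[3] * x[:, 0] ** 2 + coef[4] * x[:, 1] ** 2)
pf_rsm = np.mean(g_hat < 0)
pf_true = np.mean(g(x[:, 0], x[:, 1]) < 0)
print(f"Pf(RSM) = {pf_rsm:.4f}, Pf(direct) = {pf_true:.4f}")
```

The adaptive variants in the paper re-centre the design points near the current estimate of the design point and reweight the regression, which is what improves convergence when the true limit state is not well approximated by one global quadratic.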

Analysis of Photon Spectrum for the use of Added Filters using 3D Printing Materials (3D 프린팅 재료를 이용한 X-선 부가 여과 시 광자 스펙트럼에 대한 분석)

  • Cho, Yong-In;Lee, Sang-Ho
    • Journal of the Korean Society of Radiology
    • /
    • v.16 no.1
    • /
    • pp.15-23
    • /
    • 2022
  • 3D printing technology is being used in various fields such as medicine and biotechnology, and materials containing metal powder are being commercialized through recent material development. Therefore, this study analyzes, through simulation, the photon spectrum obtained when 3D printing materials are used as added filtration in diagnostic X-ray examinations. Among Monte Carlo codes, MCNPX (ver. 2.5.0) was used. First, the appropriateness of the photon spectrum generated in the simulation was evaluated against SRS-78 and SpekCalc, which are X-ray spectrum generation programs for the diagnostic field. Second, photon spectra for the same thicknesses of Al and Cu filters were obtained to characterize the 3D printing materials containing metal powder. In addition, the total photon fluence and average energy as functions of tube voltage were compared and analyzed. As a result, PLA-Al required about 1.2 ~ 1.4 times the thickness of the existing Al filter, and PLA-Cu about 1.4 ~ 1.7 times the thickness of the Cu filter, to provide the same degree of filtration. These results can serve as basic data for manufacturing 3D-printed added filters in medical fields.
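
The thickness ratios quoted above follow the simple equivalent-filtration relation that, for equal transmission, a less attenuating material must be proportionally thicker. A monoenergetic Beer-Lambert sketch (real spectra are polyenergetic, so the paper's MCNPX comparison is the proper treatment) with invented attenuation coefficients:

```python
# Monoenergetic Beer-Lambert sketch of "equivalent filtration": the
# thickness of a printed filter giving the same transmission as a
# reference metal filter. The attenuation coefficients are invented,
# not measured values for Al or PLA-Al.
mu_al, mu_pla_al = 1.00, 0.75        # linear attenuation, 1/mm (hypothetical)
t_al = 2.0                           # reference Al filter thickness, mm

# Equal transmission:  exp(-mu_al * t_al) = exp(-mu_eq * t_eq)
t_eq = t_al * mu_al / mu_pla_al
ratio = t_eq / t_al
print(f"equivalent PLA-Al thickness = {t_eq:.2f} mm ({ratio:.2f}x the Al filter)")
```

With these illustrative coefficients the ratio is simply mu_al / mu_pla_al; energy dependence of the coefficients is what spreads the paper's measured ratios into ranges such as 1.2 ~ 1.4.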

A Comparative Study of Simplified Probabilistic Analysis Methods for Plane Failure of Rock Slope (암반사면의 평면파괴해석을 위한 간이 확률론적 해석 비교연구)

  • Kim, Youngmin
    • Tunnel and Underground Space
    • /
    • v.31 no.5
    • /
    • pp.360-373
    • /
    • 2021
  • Many sources of uncertainty exist in geotechnical analysis, ranging from the material parameters to the sampling and testing techniques. The conventional deterministic stability analysis of a plane failure in a rock slope produces a safety factor, but not a probability of failure or a reliability index. Because such analysis lumps the ground uncertainty into a single overall safety factor, it is difficult to evaluate the stability of a realistic rock slope in detail. This paper reviews in detail some established probabilistic analysis techniques, such as Monte Carlo Simulation (MCS), the First-Order Second-Moment method (FOSM), the Point Estimate Method (PEM), and the Taylor series method, as applied to plane failure of rock slopes. While the Monte Carlo method leads to the most accurate calculation of the probability of failure, it is time-consuming. Therefore, the simplified probabilistic methods can serve as alternatives to MCS. In this study, these simplified methods are used to estimate the probability of plane failure in a rock slope.
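
The FOSM-versus-MCS comparison can be sketched on a plane-failure-style limit state (resisting force along the sliding plane minus driving force): FOSM gets the reliability index from first-order moments of g, while MCS simply counts failures. All parameter values below are invented for illustration, not taken from the paper's slope.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(6)

# Plane-failure limit state: resisting force minus driving force along
# the sliding plane. All parameter values are invented for illustration.
W, A, psi = 1000.0, 20.0, np.radians(35.0)   # block weight kN/m, plane area m^2, dip

def g(c, phi):
    return c * A + W * np.cos(psi) * np.tan(phi) - W * np.sin(psi)

mu = np.array([10.0, np.radians(35.0)])      # mean cohesion kPa, mean friction angle
sd = np.array([5.0, np.radians(3.0)])

# FOSM: mean and variance of g from a first-order Taylor expansion,
# with the partial derivatives taken by central differences.
g0 = g(*mu)
grad = np.empty(2)
for i in range(2):
    d = np.zeros(2); d[i] = 1e-6
    grad[i] = (g(*(mu + d)) - g(*(mu - d))) / 2e-6
beta = g0 / np.sqrt(np.sum((grad * sd) ** 2))
pf_fosm = norm.cdf(-beta)

# Crude Monte Carlo check on the same limit state.
c = rng.normal(mu[0], sd[0], 500_000)
phi = rng.normal(mu[1], sd[1], 500_000)
pf_mc = np.mean(g(c, phi) < 0)
print(f"beta(FOSM) = {beta:.2f}, Pf(FOSM) = {pf_fosm:.4f}, Pf(MC) = {pf_mc:.4f}")
```

Here g is nearly linear over the parameter range, so the two estimates agree closely; the simplified methods lose accuracy as the limit state becomes more nonlinear, which is the trade-off the paper examines.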