• Title/Summary/Keyword: Monte Carlo techniques

Performance Analysis of CoMP with Scheduling and Precoding Techniques in the HetNet System (이종 네트워크에서 스케줄링 및 프리코딩을 결합한 다중 포인트 협력 통신 기술의 성능 분석)

  • Kim, Bora;Moon, Sangmi;Malik, Saransh;Kim, Cheolsung;Hwang, Intae
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.50 no.8
    • /
    • pp.45-52
    • /
    • 2013
  • Coordinated Multi-Point (CoMP) is considered a technology for the 3rd Generation Partnership Project (3GPP) Long Term Evolution-Advanced (LTE-A) system. In this paper, we design and analyze the performance of the Coordinated Beamforming (CB) technique, which is one major category of CoMP. We perform Monte Carlo simulations with a Heterogeneous Network (HetNet) in LTE-A and confirm the performance through a graph of the Cumulative Distribution Function (CDF). The simulation results show a significant performance gain with the CoMP technique, and further gains when various scheduling and precoding schemes are applied. (A brief illustrative sketch follows this entry.)
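
The entry above reports Monte Carlo link simulations summarized as CDF curves. As a loose, hypothetical illustration of that methodology (not the authors' LTE-A HetNet simulator), the following Python sketch estimates an empirical SINR CDF for a toy two-cell drop under Rayleigh fading, with and without a simplified CoMP-style muting of the interfering point; all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000          # number of Monte Carlo drops (hypothetical)
noise_power = 0.1           # normalized noise power (hypothetical)

# Rayleigh-fading power gains for the serving and one interfering cell
h_serv = rng.exponential(scale=1.0, size=n_trials)
h_intf = rng.exponential(scale=0.5, size=n_trials)

# Baseline: the interfering point always transmits
sinr_baseline = h_serv / (h_intf + noise_power)
# Toy "coordination": the interfering point is muted for the victim user
sinr_comp = h_serv / noise_power

def empirical_cdf(x):
    """Return sorted SINR values in dB and their empirical CDF."""
    xs = np.sort(10 * np.log10(x))
    p = np.arange(1, xs.size + 1) / xs.size
    return xs, p

for name, sinr in [("baseline", sinr_baseline), ("CoMP-like muting", sinr_comp)]:
    xs, p = empirical_cdf(sinr)
    median = xs[np.searchsorted(p, 0.5)]
    print(f"{name}: median SINR ~ {median:.1f} dB")
```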

Dynamic Resource Reservation for Ultra-low Latency IoT Air-Interface Slice

  • Sun, Guolin;Wang, Guohui;Addo, Prince Clement;Liu, Guisong;Jiang, Wei
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.11 no.7
    • /
    • pp.3309-3328
    • /
    • 2017
  • The application of the Internet of Things (IoT) in next-generation cellular networks imposes a new characteristic on the data traffic, where a massive number of small packets need to be transmitted. In addition, some emerging IoT-based emergency services require real-time data delivery within a few milliseconds, referred to as ultra-low-latency transmission. However, current techniques cannot provide such low latency in combination with mice-flow traffic. In this paper, we propose a dynamic resource reservation scheme based on air-interface slicing in the context of a massive number of sensors with emergency flows. The proposed scheme can achieve an air-interface latency of a few milliseconds by allowing emergency flows to be transported through a dedicated radio connection with guaranteed network resources. In order to schedule the delay-sensitive flows immediately, dynamic resource updating, silence-probability-based collision avoidance, and window-based re-transmission are introduced and combined with the frame-slotted Aloha protocol. To evaluate the performance of the proposed scheme, a probabilistic model is provided to derive the analytical results, which are compared with the numerical results from Monte Carlo simulations. (An illustrative sketch follows this entry.)
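
The scheme above combines frame-slotted Aloha with a silence probability. As a rough illustration only (not the paper's analytical model or full protocol), the sketch below uses Monte Carlo simulation to estimate the probability that a transmission succeeds in a frame-slotted Aloha frame under a per-device transmit/silence probability; the device count, slot count, and probabilities are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def aloha_success_prob(n_devices, n_slots, p_tx, n_frames=20_000):
    """Monte Carlo estimate of the probability that a transmission succeeds,
    i.e. its slot is chosen by exactly one device. p_tx is the per-device
    transmit probability (1 minus the silence probability)."""
    successes, attempts = 0, 0
    for _ in range(n_frames):
        active = rng.random(n_devices) < p_tx               # devices that transmit
        slots = rng.integers(0, n_slots, size=int(active.sum()))
        counts = np.bincount(slots, minlength=n_slots)
        successes += int(np.count_nonzero(counts == 1))     # singleton slots succeed
        attempts += slots.size
    return successes / max(attempts, 1)

# Hypothetical numbers: 200 sensors contending for 64 slots per frame
for p_tx in (0.2, 0.5, 1.0):
    print(f"p_tx={p_tx:.1f}: success probability ~ {aloha_success_prob(200, 64, p_tx):.3f}")
```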

Bayesian Method for Modeling Male Breast Cancer Survival Data

  • Khan, Hafiz Mohammad Rafiqullah;Saxena, Anshul;Rana, Sagar;Ahmed, Nasar Uddin
    • Asian Pacific Journal of Cancer Prevention
    • /
    • v.15 no.2
    • /
    • pp.663-669
    • /
    • 2014
  • Background: With recent progress in health science administration, a huge amount of data has been collected from thousands of subjects. Statistical and computational techniques are necessary to understand such data and to draw valid scientific conclusions. The purpose of this paper was to develop a statistical probability model and to predict future survival times for male breast cancer patients who were diagnosed in the USA during 1973-2009. Materials and Methods: A random sample of 500 male patients was selected from the Surveillance Epidemiology and End Results (SEER) database. The survival times for the male patients were used to derive the statistical probability model. To assess goodness of fit, the model-selection criteria Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), and Deviance Information Criterion (DIC) were employed. A novel Bayesian method was used to derive the posterior density function for the parameters and the predictive inference for future survival times from the exponentiated Weibull model, assuming that the observed breast cancer survival data follow this type of model. The Markov chain Monte Carlo method was used to obtain the inference for the parameters. Results: Summary results for certain demographic and socio-economic variables are reported. It was found that the exponentiated Weibull model fits the male survival data. Statistical inferences for the posterior parameters are presented. Mean predictive survival times, 95% predictive intervals, predictive skewness, and kurtosis were obtained. Conclusions: The findings will hopefully be useful in treatment planning and healthcare resource allocation, and may motivate future research on breast cancer related survival issues. (A brief illustrative sketch follows this entry.)
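
The abstract above relies on Markov chain Monte Carlo inference for a Bayesian survival model. The sketch below is a minimal random-walk Metropolis sampler for a plain two-parameter Weibull fitted to synthetic survival times, not the authors' exponentiated Weibull model or the SEER data; the priors, proposal scale, and iteration counts are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "survival times" in months (hypothetical, NOT SEER data)
t = rng.weibull(1.3, size=500) * 60.0

def log_post(log_k, log_lam):
    """Log-posterior of a two-parameter Weibull with vague normal priors on the
    log-scale parameters (a simplification of the exponentiated Weibull model)."""
    k, lam = np.exp(log_k), np.exp(log_lam)
    loglik = (t.size * np.log(k) - t.size * k * np.log(lam)
              + (k - 1) * np.log(t).sum() - ((t / lam) ** k).sum())
    logprior = -0.5 * (log_k ** 2 + (log_lam / 10.0) ** 2)
    return loglik + logprior

# Random-walk Metropolis sampling
theta = np.array([0.0, np.log(t.mean())])
cur_lp = log_post(*theta)
samples = []
for i in range(20_000):
    prop = theta + rng.normal(scale=0.05, size=2)
    prop_lp = log_post(*prop)
    if np.log(rng.random()) < prop_lp - cur_lp:
        theta, cur_lp = prop, prop_lp
    if i >= 5_000:                         # discard burn-in
        samples.append(np.exp(theta))
samples = np.array(samples)
print("posterior mean shape, scale:", samples.mean(axis=0).round(3))

# Posterior predictive draws of a future survival time
pred = rng.weibull(samples[:, 0]) * samples[:, 1]
print("95% predictive interval (months):", np.percentile(pred, [2.5, 97.5]).round(1))
```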

Sputtering of Solid Surfaces at Ion Bombardment

  • Kang, Hee-Jae
    • Proceedings of the Korean Vacuum Society Conference
    • /
    • 1998.02a
    • /
    • pp.20-20
    • /
    • 1998
  • Ion beam technology has recently attracted much interest because it has exciting technological potential for surface analysis, ion beam mixing, surface cleaning and etching in thin film growth and semiconductor fabrication processes, etc. Especially, ion beam sputtering has been widely used for sputter depth profiling with X-ray photoelectron spectroscopy (XPS), Auger electron spectroscopy (AES), and secondary-ion mass spectroscopy (SIMS). However, the problem of surface compositional change due to ion bombardment remains to be understood and solved. So far, sputtering processes have been studied by surface analysis tools such as XPS, AES, and SIMS, which use the sputtering process again. It would be improbable to measure the composition profiles of surfaces modified by ion beam bombardment accurately with surface analysis techniques that are themselves based on sputter depth profiling. Recently, however, medium energy ion scattering spectroscopy (MEIS) has been applied to study the sputtering of solid surfaces at ion bombardment and has proved extremely valuable in probing the surface composition and structure nondestructively and quantitatively with less than 1.0 nm depth resolution. To understand the sputtering processes of solid surfaces at ion bombardment, Molecular Dynamics (MD) and Monte Carlo (MC) simulations have been used and give an intimate insight into the sputtering processes of solid surfaces. In this presentation, the sputtering processes of alloys and compound samples at ion bombardment will be reviewed, and the MEIS results for the Ar+ sputter-induced altered layer of the Ta2O5 thin film and the damage profiling of the Ar+ ion sputtered Si(100) surface will be discussed with the results of MD and MC simulations.

  • PDF

APOLLO2 YEAR 2010

  • Sanchez, Richard;Zmijarevi, Igor;Coste-Delclaux, M.;Masiello, Emiliano;Santandrea, Simone;Martinolli, Emanuele;Villate, Laurence;Schwartz, Nadine;Guler, Nathalie
    • Nuclear Engineering and Technology
    • /
    • v.42 no.5
    • /
    • pp.474-499
    • /
    • 2010
  • This paper presents the most important developments implemented in the APOLLO2 spectral code since its last general presentation at the 1999 M&C conference in Madrid. APOLLO2 has been provided with new capabilities in the domain of cross section self-shielding, including mixture effects and transfer matrix self-shielding, new or improved flux solvers (CPM for RZ geometry, heterogeneous cells for short MOC and the linear-surface scheme for long MOC), improved acceleration techniques ($DP_1$) that are also applied to thermal and external iterations, and a number of sophisticated modules and tools to help user calculations. The method of characteristics, which took over from the collision probability method as the main flux solver of the code, allows for whole-core two-dimensional heterogeneous calculations. A flux reconstruction technique leads to fast yet accurate solutions used for industrial applications. The APOLLO2 code has been integrated (APOLLO2-A) within the $ARCADIA^{(R)}$ reactor code system of AREVA as a cross section generator for PWR and BWR fuel assemblies. APOLLO2 is also extensively used by Electricite de France within its reactor calculation chain. A number of numerical examples are presented to illustrate APOLLO2 accuracy by comparison to Monte Carlo reference calculations. Results of the validation program are compared to measured values from power plants and critical experiments.

Evaluating Applicability of SRTM DEM (Shuttle Radar Topography Mission Digital Elevation Model) in Hydrologic Analysis: A Case Study of Geum River and Daedong River Areas (수문인자추출에서의 SRTM DEM (Shuttle Radar Topography Mission Digital Elevation Model) 적용성 평가: 대동강 및 금강 지역 사례연구)

  • Her, Younggu;Yoo, Seung-Hwan
    • Journal of The Korean Society of Agricultural Engineers
    • /
    • v.55 no.6
    • /
    • pp.101-112
    • /
    • 2013
  • The Shuttle Radar Topography Mission Digital Elevation Model (SRTM DEM) offers opportunities to advance many research areas, including hydrology, by providing near-global elevation measurements at a uniform resolution. Its wide coverage and complimentary online access especially benefit researchers requiring topographic information for hard-to-access areas. However, SRTM DEM also contains inherent errors, which propagate through its manipulation into analysis outputs. The sensitivity of hydrologic analysis to these errors has not yet been fully understood. This study investigated their impact on the estimation of hydrologic derivatives such as slope, stream network, and watershed boundary using Monte Carlo simulation and spatial moving average techniques. Different amounts of error and their spatial autocorrelation structures were considered. Two sub-watersheds of the Geum and Daedong River areas, located in South and North Korea respectively, were selected as the study areas. The results demonstrated that the spatial representation of stream networks and watershed boundaries, and their length and area estimates, can be greatly affected by SRTM DEM errors, in particular in relatively flat areas. In the Daedong River area, artifacts of the SRTM DEM created sinks even after the filling process, and thus closed drainage basins and short stream lines, which is not the case in reality. These findings provide evidence that SRTM DEM alone may not be enough to accurately characterize the hydrologic features of a watershed, suggesting the need for local knowledge and complementary data. (A brief illustrative sketch follows this entry.)
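
To illustrate the kind of Monte Carlo error propagation with spatially averaged error fields described above (not the study's actual SRTM processing), the sketch below perturbs a small synthetic DEM with smoothed Gaussian noise and tracks how the mean slope responds; the grid size, cell size, error magnitude, and the use of scipy's uniform_filter as the moving-average operator are all assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter   # simple spatial moving average

rng = np.random.default_rng(3)

# Hypothetical 100 x 100 DEM (metres): a gently tilted plane standing in for
# an SRTM tile over a relatively flat area; cell size 90 m.
ny, nx, cell = 100, 100, 90.0
yy, xx = np.mgrid[0:ny, 0:nx]
dem = 200.0 + 0.5 * xx + 0.2 * yy

def mean_slope(z):
    """Mean slope in degrees from central differences."""
    dzdy, dzdx = np.gradient(z, cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy))).mean()

base = mean_slope(dem)
slopes = []
for _ in range(200):                                   # Monte Carlo realizations
    noise = rng.normal(0.0, 5.0, size=dem.shape)       # ~5 m vertical error (hypothetical)
    noise = uniform_filter(noise, size=5)              # impose spatial autocorrelation
    slopes.append(mean_slope(dem + noise))
slopes = np.array(slopes)
print(f"error-free mean slope: {base:.2f} deg")
print(f"with DEM error:        {slopes.mean():.2f} +/- {slopes.std():.2f} deg")
```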

Dose and Image Evaluations of Imaging for Radiotherapy (방사선치료를 위한 영상장비의 선량 및 영상 평가)

  • Lee, Hyounggun;Yoon, Changyeon;Kim, Tae Jun;Kim, Dongwook;Chung, Weon Kyu;Park, Sung Ho;Lee, Wonho
    • Progress in Medical Physics
    • /
    • v.23 no.4
    • /
    • pp.292-302
    • /
    • 2012
  • Patient dose in advanced radiotherapy techniques is an important issue, and imaging methods should be evaluated to reduce the dose delivered by diagnostic imaging for radiotherapy. In particular, computed tomography (CT) is widely used in radiotherapy; hence, CT dose and image quality were evaluated in this study. The dose and image evaluations were performed under identical conditions so that dose and image quality could be compared simultaneously. Furthermore, the feasibility of dose and image evaluation using the Monte Carlo simulation code MCNPX was confirmed. We performed iterative reconstruction of low-dose CT images with Maximum Likelihood Expectation Maximization (MLEM) to improve image quality. The system we developed is expected to be used not only to reduce patient dose in radiotherapy but also to evaluate the overall factors of imaging modalities in industrial research. (An illustrative sketch follows this entry.)
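
The abstract refers to MLEM iterative reconstruction. The following sketch applies the standard MLEM multiplicative update to a toy Poisson inverse problem with a random non-negative system matrix, purely to show the update step; it is not the authors' CT geometry or reconstruction pipeline, and all sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy problem: recover a small 1-D "image" from Poisson-noisy projections
# through a random non-negative system matrix A (a crude stand-in for a
# CT forward projector).
n_pix, n_meas = 32, 64
A = rng.random((n_meas, n_pix))
x_true = np.zeros(n_pix)
x_true[10:20] = 5.0                        # a simple block object
y = rng.poisson(A @ x_true)                # noisy measurements

# MLEM update: x <- x / (A^T 1) * A^T (y / (A x))
x = np.ones(n_pix)                         # positive initial estimate
sens = A.sum(axis=0)                       # sensitivity image A^T 1
for _ in range(200):
    proj = A @ x
    proj[proj == 0] = 1e-12                # guard against divide-by-zero
    x *= (A.T @ (y / proj)) / sens

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```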

A hidden Markov model for long term drought forecasting in South Korea

  • Chen, Si;Shin, Ji-Yae;Kim, Tae-Woong
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2015.05a
    • /
    • pp.225-225
    • /
    • 2015
  • Drought events usually evolve slowly in time, and their impacts generally span a long period of time. This indicates that the sequence of drought is not completely random. The Hidden Markov Model (HMM) is a probabilistic model used to represent dependencies between unobserved hidden states that ultimately give rise to observations. Drought characteristics depend on the underlying generating mechanism, which can be well modelled by the HMM. This study employed an HMM with Gaussian emissions to fit the Standardized Precipitation Index (SPI) series and to make multi-step predictions of future drought characteristics. To estimate the parameters of the HMM, we employed a Bayesian model computed via Markov Chain Monte Carlo (MCMC). Since the true number of hidden states is unknown, we fit the model with a varying number of hidden states and used reversible jump to allow transdimensional moves between models with different numbers of states. We applied the HMM to SPI data from several stations in South Korea. The monthly SPI data from January 1973 to December 2012 were divided into two parts: the first 30 years (January 1973 to December 2002) were used for model calibration and the last 10 years (January 2003 to December 2012) for model validation. All SPI data were preprocessed through wavelet denoising and used as the observed output of the HMM. Forecasting performance at different lead times (T = 1, 3, 6, 12 months) was compared with conventional forecasting techniques (e.g., ANN and ARMA). Based on the statistical evaluation, the HMM produced significantly better results than the conventional models, with much larger forecasting skill scores (about 0.3-0.6) and lower Root Mean Square Error (RMSE) values (about 0.5-0.9). (A brief illustrative sketch follows this entry.)

  • PDF
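
The study above fits a Gaussian-emission HMM to SPI series via reversible-jump MCMC. As a simplified stand-in, the sketch below fits a fixed three-state Gaussian HMM by EM using the hmmlearn package (an assumed external dependency, not the paper's Bayesian implementation) to a synthetic SPI-like series and produces multi-step forecasts by propagating the last state distribution through the transition matrix.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # external dependency (assumption)

rng = np.random.default_rng(5)

# Synthetic SPI-like monthly series (hypothetical, not the Korean station data):
# dry / normal / wet regimes with Gaussian noise.
regime_means = [-1.2, 0.0, 1.0]
states = rng.integers(0, 3, size=480)                  # 40 years of months
spi = np.array([rng.normal(regime_means[s], 0.4) for s in states]).reshape(-1, 1)
train = spi[:360]                                      # first 30 years for fitting

# Fit a 3-state Gaussian HMM by EM (the paper instead uses reversible-jump MCMC)
model = GaussianHMM(n_components=3, covariance_type="diag",
                    n_iter=200, random_state=0)
model.fit(train)

# Multi-step forecast: propagate the posterior state distribution at the last
# time step through the transition matrix and take the expected emission mean.
p_last = model.predict_proba(train)[-1]
for lead in (1, 3, 6, 12):
    p_lead = p_last @ np.linalg.matrix_power(model.transmat_, lead)
    forecast = float(p_lead @ model.means_.ravel())
    print(f"lead {lead:2d} months: forecast SPI ~ {forecast:+.2f}")
```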

A novel radioactive particle tracking algorithm based on deep rectifier neural network

  • Dam, Roos Sophia de Freitas;dos Santos, Marcelo Carvalho;do Desterro, Filipe Santana Moreira;Salgado, William Luna;Schirru, Roberto;Salgado, Cesar Marques
    • Nuclear Engineering and Technology
    • /
    • v.53 no.7
    • /
    • pp.2334-2340
    • /
    • 2021
  • Radioactive particle tracking (RPT) is a minimally invasive nuclear technique that tracks a radioactive particle inside a volume of interest by means of a mathematical location algorithm. During the past decades, many algorithms have been developed, including ones based on artificial intelligence techniques. In this study, the RPT technique is applied in a simulated test section that employs a simplified mixer filled with concrete, six scintillator detectors, and a 137Cs radioactive particle emitting gamma rays of 662 keV. The test section was developed using the MCNPX code, which is a mathematical code based on Monte Carlo simulation, and 3516 different radioactive particle positions (x, y, z) were simulated. The novelty of this paper is the use of a location algorithm based on a deep learning model, more specifically a 6-layer deep rectifier neural network (DRNN), in which the hyperparameters were defined using a Bayesian optimization method. The DRNN is a type of deep feedforward neural network that replaces the sigmoid-based activation functions, traditionally used in vanilla Multilayer Perceptron networks, with rectified activation functions. Results show the great accuracy of the DRNN in an RPT system. The root mean squared errors for the x, y, and z coordinates of the radioactive particle are 0.03064, 0.02523, and 0.07653, respectively. (An illustrative sketch follows this entry.)
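
The abstract above describes a six-layer rectifier network that regresses particle coordinates from detector counts. The sketch below imitates that setup with scikit-learn's MLPRegressor and a crude inverse-square detector response on synthetic positions; the geometry, noise model, and hyperparameters are assumptions and differ from the paper's MCNPX data and its Bayesian-tuned DRNN.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor   # stand-in for the paper's DRNN

rng = np.random.default_rng(6)

# Hypothetical geometry: 6 detectors on the face centres of a unit cube.
detectors = np.array([[0.5, 0.5, 0.0], [0.5, 0.5, 1.0],
                      [0.5, 0.0, 0.5], [0.5, 1.0, 0.5],
                      [0.0, 0.5, 0.5], [1.0, 0.5, 0.5]])

# Synthetic "counts": noisy inverse-square response, a crude stand-in for the
# MCNPX-simulated detector readings used in the paper.
positions = rng.random((3516, 3))                        # particle (x, y, z)
d2 = ((positions[:, None, :] - detectors[None, :, :]) ** 2).sum(axis=-1)
counts = (1.0 / (d2 + 1e-3)) * rng.normal(1.0, 0.02, size=d2.shape)
features = np.log(counts)                                # compress dynamic range

X_tr, X_te, y_tr, y_te = train_test_split(features, positions,
                                          test_size=0.2, random_state=0)

# Six hidden ReLU layers as a rough analogue of the 6-layer DRNN; the widths
# and training settings are arbitrary, not the paper's tuned values.
net = MLPRegressor(hidden_layer_sizes=(64,) * 6, activation="relu",
                   max_iter=1000, random_state=0)
net.fit(X_tr, y_tr)

rmse = np.sqrt(((net.predict(X_te) - y_te) ** 2).mean(axis=0))
print("RMSE per coordinate (x, y, z):", rmse.round(4))
```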

A Development of Nurse Scheduling Model Based on Q-Learning Algorithm

  • JUNG, In-Chul;KIM, Yeun-Su;IM, Sae-Ran;IHM, Chun-Hwa
    • Korean Journal of Artificial Intelligence
    • /
    • v.9 no.1
    • /
    • pp.1-7
    • /
    • 2021
  • In this paper, we focus on the creation of nurse schedules, which has become a social problem. A nurse schedule must account for three shifts, appropriate placement of experienced staff, fairness of work assignment, and legal work standards. Because of the complex structure of the nurse schedule, which must reflect various requirements, in most hospitals the nurse in charge writes it by hand, with a great deal of time and effort. This study attempted to automatically create an optimized nurse schedule based on legal labor standards and fairness. We developed an I/O Q-Learning algorithm-based model using Python and a web application for automatic nurse scheduling. The model was trained to converge toward 100 on a Fairness Indicator Score (FIS) that considers the Labor Standards Act, work equity, and work preference. Manual nurse schedules and schedules generated by this model were compared using the FIS. The model showed a higher work equity index by 13.31 points, a higher work preference index by 1.52 points, and a higher FIS by 16.38 points. This study was able to automatically generate nurse schedules based on reinforcement learning. In addition, when the nurse schedule of E hospital was created using this model, the time required was reduced from 88 hours to 3 hours. If the FIS is further refined and reinforcement learning techniques such as DQN, CNN, Monte Carlo simulation, and AlphaZero are additionally utilized, a more optimized model can be developed. (A brief illustrative sketch follows this entry.)
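
The abstract above applies Q-learning to nurse scheduling with a fairness score. The sketch below shows only the tabular Q-learning mechanics on a drastically simplified single-nurse, one-week toy problem; the states, actions, and reward terms (rest rule, preference bonus) are invented for illustration and do not reproduce the paper's FIS or its I/O Q-learning model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy single-nurse problem: state = yesterday's shift, action = today's shift.
SHIFTS = ["Off", "Day", "Evening", "Night"]
PREFERRED = 1                            # hypothetical preference for day shifts

def reward(prev, action):
    """Hand-crafted reward standing in for a fairness/legality score."""
    r = 0.0
    if prev == 3 and action in (1, 2):   # day/evening right after a night shift
        r -= 10.0                        # violates the rest rule (illustrative)
    if prev == 3 and action == 3:
        r -= 2.0                         # discourage consecutive night shifts
    if action == PREFERRED:
        r += 1.0                         # work-preference bonus
    return r

Q = np.zeros((4, 4))                     # Q-table over (state, action)
alpha, gamma, eps = 0.1, 0.9, 0.1
for episode in range(5000):
    s = 0                                # start the week rested (Off)
    for day in range(7):
        a = int(rng.integers(4)) if rng.random() < eps else int(Q[s].argmax())
        r = reward(s, a)
        s_next = a
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

# Greedy one-week schedule from the learned Q-table
s, week = 0, []
for day in range(7):
    a = int(Q[s].argmax())
    week.append(SHIFTS[a])
    s = a
print("greedy weekly schedule:", week)
```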