• Title/Summary/Keyword: Software Size Estimation

Search Results: 111

Effect of Arrangement of Design Elements on Recognition of Complex Signs

  • Ishihara, Maki;Okada, Akira;Yamashita, Kuniko
    • Journal of the Ergonomics Society of Korea
    • /
    • v.26 no.4
    • /
    • pp.143-146
    • /
    • 2007
  • Due to the expansion of cities and the increasing number of large-scale, complex public spaces, public signage is increasing, and the information presented on these signs tends to be diverse and complicated. For complex signs that contain multiple destinations or other information, designers must determine not only the proper size, color, and so on, but also the most effective arrangement of design elements. In previous research, the cognitive utility of complex public signs was estimated using computer simulation software. In the current research, we focused on objectively evaluating the results obtained in the previous research using an eye mark recording system. Two cognitive engineering experiments clarified five points for improving the usability of complex signs: 1) Parallel construction of characters and pictograms is more efficient. 2) Grouping elements results in rapid recognition of information chunks. 3) Visual characters and pictograms are effective, along with a proper density of information. 4) A specific arrangement of sign arrows is effective. 5) Figures on signs influence the sequence of information searches.

Impacts of Ownership Structure on Systemic Risk of Listed Companies in Vietnam

  • VU, Van Thi Thuy;PHAN, Nghia Trong;DANG, Hung Ngoc
    • The Journal of Asian Finance, Economics and Business
    • /
    • v.7 no.2
    • /
    • pp.107-117
    • /
    • 2020
  • The research objective of this paper is to clarify the factors influencing the systemic risk of listed companies in Vietnam, with a focus on clarifying the relationship and quantifying the impact of ownership structure on systemic risk. The data used in this study included financial statements and stock price data of companies listed on the Ho Chi Minh City Stock Exchange and the Hanoi Stock Exchange of the Vietnam stock market in the period from 2010 to 2017. The paper establishes regression models and chooses among three estimators, the Random Effects Model, the Fixed Effects Model, and Pooled OLS, with regressions run in the Stata statistical software. The results showed that state ownership and ownership by foreign investors were positively related to systemic risk, while ownership by domestic investors had an inverse relationship with the systemic risk of listed companies in Vietnam. In addition, as control variables, both company size and profitability had an effect on the systemic risk of the listed companies in the research sample. Based on these results, the authors discuss some implications for minimizing systemic risk in the operation of listed companies in Vietnam.
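The abstract above selects among Pooled OLS, fixed-effects, and random-effects panel regressions (run in Stata). A minimal Python sketch of that kind of comparison follows; all variable names (sys_risk, state_own, foreign_own, size, roa) and the synthetic data are hypothetical placeholders, and the random-intercept mixed model merely stands in for the random-effects estimator rather than reproducing the authors' procedure.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical firm-year panel: systemic risk vs. ownership shares and controls.
rng = np.random.default_rng(0)
firms, years = 50, 8
df = pd.DataFrame({
    "firm": np.repeat(np.arange(firms), years),
    "year": np.tile(np.arange(2010, 2010 + years), firms),
})
firm_effect = rng.normal(0, 0.1, firms)
df["state_own"] = rng.uniform(0, 0.6, len(df))
df["foreign_own"] = rng.uniform(0, 0.4, len(df))
df["size"] = rng.normal(12, 1, len(df))          # e.g. log of total assets
df["roa"] = rng.normal(0.05, 0.02, len(df))      # profitability control
df["sys_risk"] = (0.8 * df["state_own"] + 0.5 * df["foreign_own"]
                  + 0.1 * df["size"] + firm_effect[df["firm"].to_numpy()]
                  + rng.normal(0, 0.2, len(df)))

# Pooled OLS: ignores firm-specific effects entirely.
pooled = smf.ols("sys_risk ~ state_own + foreign_own + size + roa", data=df).fit()

# Fixed effects via firm dummies (dummy-variable form of the within estimator).
fixed = smf.ols("sys_risk ~ state_own + foreign_own + size + roa + C(firm)", data=df).fit()

# Random effects approximated here by a random-intercept mixed model.
random_fx = smf.mixedlm("sys_risk ~ state_own + foreign_own + size + roa",
                        data=df, groups=df["firm"]).fit()

for name, res in [("pooled", pooled), ("fixed", fixed), ("random", random_fx)]:
    print(name, res.params[["state_own", "foreign_own"]].round(3).to_dict())
```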

A Case Study of Sustainable Potential of Rainwater System Development for Household Water Consumption in Nigeria (지속가능한 생활용 우수시스템 개발 사례)

  • Adelodun, Bashir;Choi, Kyung-Sook
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2018.05a
    • /
    • pp.485-485
    • /
    • 2018
  • A rainwater harvesting (RWH) system can provide relief for households and farmers, especially in areas with intense water scarcity during the long lull of the rainy season. However, little attention has been given to this alternative water source in Nigeria. This paper estimates the per capita water demand for 1,950 inhabitants and the rainwater potential in Ojonbodu Estate, Oyo State, Nigeria, using data from detailed questionnaires, water consumption calculator software, and 20 years of rainfall data. The potential rainwater estimation was based on the amount of precipitation, the size of the catchment, and the runoff coefficient. Using the estimated values of $39,420m^3$ and $6.5114{\times}10^7m^3$ for per capita consumption and potential rainwater, respectively, a rainwater harvesting system was designed for rainwater collection and storage. The harvested rainwater was $450,000m^3$ with a collection efficiency of 69.16%, which exceeded the household water consumption requirement. Thus, the harvested rainwater was able to meet the estimated water demand of the Ojonbodu Estate households during the period of water scarcity.
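The potential-rainwater estimate described above is the product of precipitation, catchment size, and runoff coefficient. The sketch below illustrates that arithmetic with made-up inputs; the function names, the example numbers, and the definition of collection efficiency as the harvested share of the potential volume are all assumptions, not the study's data.

```python
def potential_rainwater_m3(annual_rainfall_mm: float,
                           catchment_area_m2: float,
                           runoff_coefficient: float) -> float:
    """Potential harvestable volume per year: rainfall depth x area x runoff coefficient."""
    return (annual_rainfall_mm / 1000.0) * catchment_area_m2 * runoff_coefficient

def collection_efficiency_pct(harvested_m3: float, potential_m3: float) -> float:
    """Harvested volume as a percentage of the potential volume."""
    return 100.0 * harvested_m3 / potential_m3

# Example with hypothetical numbers: 1,200 mm/yr of rain on 5,000 m^2 of roofs, C = 0.8.
potential = potential_rainwater_m3(1200, 5000, 0.8)
print(f"potential supply: {potential:,.0f} m^3/yr")
print(f"efficiency if 3,300 m^3 is actually collected: "
      f"{collection_efficiency_pct(3300, potential):.1f}%")
```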


Area-to-Area Poisson Kriging Analysis of Mapping of County-Level Esophageal Cancer Incidence Rates in Iran

  • Asmarian, Naeimeh Sadat;Ruzitalab, Ahmad;Amir, Kavousi;Masoud, Salehi;Mahaki, Behzad
    • Asian Pacific Journal of Cancer Prevention
    • /
    • v.14 no.1
    • /
    • pp.11-13
    • /
    • 2013
  • Background: Esophageal cancer, the third most common gastrointestinal cancer overall, shows a high incidence in parts of Iran. The counties of Iran vary in size, shape, and population. The aim of this study was to account for spatial support with Area-to-Area (ATA) Poisson kriging to increase the precision of parameter estimates, yield correct variances, and create maps of disease rates. Materials and Methods: This study applied an ecological methodology, illustrated using esophageal cancer data recorded by the Ministry of Health and Medical Education (Non-infectious Diseases Management Center) of Iran. The analysis covered the 336 counties over the years 2003-2007. ATA Poisson kriging was used for estimating the mapping parameters, with SpaceStat and ArcGIS 9.3 software used for analyzing the data and drawing maps. Results: Northern counties of Iran have high estimated risk. The ATA Poisson kriging approach yielded increased variance in large, sparsely populated counties, so central counties had the highest prediction variance. Conclusions: The ATA Poisson kriging approach is recommended for estimating the parameters of disease maps, since this method accounts for spatial support and patterns in irregular spatial areas. The results demonstrate that counties in the provinces of Ardebil, Mazandaran, and Kordestan have higher risk than other counties.
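The finding that large, sparsely populated counties carry the most prediction variance reflects basic Poisson rate behaviour: the variance of a crude incidence rate scales inversely with population. The sketch below shows only that relationship; it is not an implementation of area-to-area Poisson kriging, and the rate and populations are hypothetical.

```python
# Under a Poisson count model, cases ~ Poisson(rate * population), so the crude
# rate estimate cases/population has variance rate/population: small populations
# give noisy rates, which is where kriging-based smoothing helps most.
true_rate = 5e-5  # hypothetical incidence per person-year

for population in (10_000, 100_000, 1_000_000):
    se_rate = (true_rate / population) ** 0.5
    print(f"population {population:>9,}: SE of crude rate = {se_rate:.2e}")
```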

Quantitative Methodology for Analyzing Propriety of Complement and Salary on Military Organization - Concentrating on Army Doctrine Research Institution - (군(軍) 내 민간인력 적정 규모 및 임금 분석을 위한 정량적 방법론 - 육군 교리업무조직을 중심으로 -)

  • Beak, Byungho;Kim, Yeekhyun;Lee, Yong-Bok;Min, Seunghee;Jee, Yonghoon
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.43 no.1
    • /
    • pp.34-41
    • /
    • 2020
  • There has so far been no scientific analysis of the appropriate workforce size and salary for civilian workers in the military. This paper therefore analyzes the appropriate employment size for military doctrine researchers using a system dynamics methodology based on the annual military doctrine workload. Vensim software was used to measure the complement of the research group based on data from a job analysis. Second, a multiple regression analysis was performed to study an appropriate wage for researchers based on their expertise and working conditions. Data from twenty public research institutions and twenty-eight job positions performing duties similar to those of military doctrine researchers were obtained and used to create a salary-estimation regression equation. Finally, using the cost-benefit analysis method, this paper studied the financial effectiveness of hiring military doctrine researchers. The contingent valuation method, recognized as one of the most effective methodologies for cost-benefit analysis of intangible value, was used to measure the benefit of hiring the researchers. The methodology presented in this paper can be applied to measure and improve the efficiency of military organizations not only in the military doctrine research area but also in several other military functional areas (military training, logistics, administration, combat development, and combat support).
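A minimal stock-and-flow sketch in the spirit of the system dynamics (Vensim) sizing exercise described above: a doctrine-work backlog accumulates incoming tasks and is drained by researcher capacity, and the smallest headcount that keeps the backlog cleared is reported. Every parameter value is a hypothetical placeholder, not the study's job-analysis data.

```python
def simulate_backlog(researchers: int,
                     annual_workload: float = 1200.0,      # task-hours arriving per year
                     hours_per_researcher: float = 400.0,  # productive hours per researcher-year
                     years: int = 10) -> float:
    """Remaining backlog (task-hours) after simulating the stock-and-flow model."""
    backlog = 0.0
    for _ in range(years):
        backlog += annual_workload                                    # inflow
        backlog -= min(backlog, researchers * hours_per_researcher)   # outflow
    return backlog

# Smallest headcount for which the backlog is fully worked off each year.
for n in range(1, 11):
    if simulate_backlog(n) == 0.0:
        print(f"backlog cleared with {n} researchers")
        break
```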

Influence of the Alveolar Cleft Type on Preoperative Estimation Using 3D CT Assessment for Alveolar Cleft

  • Choi, Hang Suk;Choi, Hyun Gon;Kim, Soon Heum;Park, Hyung Jun;Shin, Dong Hyeok;Jo, Dong In;Kim, Cheol Keun;Uhm, Ki Il
    • Archives of Plastic Surgery
    • /
    • v.39 no.5
    • /
    • pp.477-482
    • /
    • 2012
  • Background Bone grafting for the alveolar cleft has been accepted as one of the essential treatments for cleft lip patients. Precise preoperative measurement of the architecture and size of the bone defect in the alveolar cleft is considered helpful for increasing the success rate of bone grafting because those features may vary with the cleft type. Recently, some studies have reported on the usefulness of three-dimensional (3D) computed tomography (CT) assessment of alveolar bone defects; however, no study has yet examined the possible influence of the cleft type on the difference between the presumed and actual values. We aimed to evaluate the clinical predictability of such measurement using 3D CT assessment according to the cleft type. Methods The study consisted of 47 pediatric patients. The subjects were divided according to the cleft type. CT was performed before the graft operation and assessed using image analysis software. The statistical significance of the difference between the preoperative estimation and the intraoperative measurement was analyzed. Results The difference between the preoperative and intraoperative values was $-0.1{\pm}0.3cm^3$ (P=0.084). There was no significant intergroup difference, but the groups with a cleft palate showed a significant difference of $-0.2{\pm}0.3cm^3$ (P<0.05). Conclusions Assessment of the alveolar cleft volume using 3D CT scan data and image analysis software can help in selecting the optimal graft procedure and extracting the correct volume of cancellous bone for grafting. Considering the cleft type, it would be helpful to extract an additional volume of $0.2cm^3$ in the presence of a cleft palate.
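The abstract compares preoperative 3D CT volume estimates with intraoperative measurements and reports a mean difference with a P value. The sketch below shows one common way to run such a paired comparison; the data are synthetic, and the paired t-test is an assumption, since the abstract does not name the specific test used.

```python
import numpy as np
from scipy import stats

# Hypothetical preoperative (3D CT) and intraoperative cleft volumes in cm^3.
rng = np.random.default_rng(1)
preop = rng.normal(1.0, 0.3, size=20)
intraop = preop + rng.normal(0.1, 0.3, size=20)   # intraoperative values tend to be larger

difference = preop - intraop
t_stat, p_value = stats.ttest_rel(preop, intraop)
print(f"mean difference = {difference.mean():+.2f} cm^3, paired t-test p = {p_value:.3f}")
```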

A Meta Analysis of Using Structural Equation Model on the Korean MIS Research (국내 MIS 연구에서 구조방정식모형 활용에 관한 메타분석)

  • Kim, Jong-Ki;Jeon, Jin-Hwan
    • Asia pacific journal of information systems
    • /
    • v.19 no.4
    • /
    • pp.47-75
    • /
    • 2009
  • Recently, research on Management Information Systems (MIS) has laid out theoretical foundations and academic paradigms by introducing diverse theories, themes, and methodologies. In particular, the academic paradigms of MIS encourage a user-friendly approach by developing technologies from the users' perspectives, which reflects the existence of strong causal relationships between information systems and user behavior. As in other areas of social science, the use of structural equation modeling (SEM) has increased rapidly in recent years, especially in the MIS area. The SEM technique is important because it provides powerful ways to address key IS research problems. It also has a unique ability to simultaneously examine a series of causal relationships while analyzing multiple independent and dependent variables at the same time. In spite of providing many benefits to MIS researchers, there are some potential pitfalls with the analytical technique. The research objective of this study is to provide guidelines for the appropriate use of SEM based on an assessment of the current practice of using SEM in MIS research. This study focuses on several statistical issues related to the use of SEM in MIS research. Selected articles are assessed in three parts through the meta-analysis. The first part is related to the initial specification of the theoretical model of interest. The second is about data screening prior to model estimation and testing, and the last part concerns the estimation and testing of theoretical models based on empirical data. This study reviewed the use of SEM in 164 empirical research articles published in four major MIS journals in Korea (APJIS, ISR, JIS and JITAM) from 1991 to 2007. APJIS, ISR, JIS and JITAM accounted for 73, 17, 58, and 16 of the total number of applications, respectively. The number of published applications has increased over time. LISREL was the most frequently used SEM software among MIS researchers (97 studies (59.15%)), followed by AMOS (45 studies (27.44%)). In the first part, regarding issues related to the initial specification of the theoretical model of interest, all of the studies used cross-sectional data. Studies that use cross-sectional data may be better able to explain their structural model as a set of relationships. Most of the SEM studies, meanwhile, employed confirmatory-type analysis (146 articles (89%)). Regarding model formulation, 159 (96.9%) of the studies estimated the full structural equation model; in only 5 studies was SEM used for the measurement model with a set of observed variables. The average sample size across all models was 365.41, with samples as small as 50 and as large as 500. The second part concerns data screening prior to model estimation and testing. Data screening is important for researchers, particularly in defining how they deal with missing values. Overall, data screening was discussed in 118 (71.95%) of the studies, while no study discussed evidence of multivariate normality for the models. In the third part, on issues related to the estimation and testing of theoretical models on empirical data, assessing model fit is one of the most important issues because it provides adequate statistical power for research models. Multiple fit indices were used in the SEM applications.
The $\chi^2$ test was reported in most of the studies (146 (89%)), whereas the normed $\chi^2$ was reported less frequently (65 studies (39.64%)); a normed $\chi^2$ of 3 or lower is generally required for adequate model fit. The most popular model fit indices were GFI (109 (66.46%)), AGFI (84 (51.22%)), NFI (44 (47.56%)), RMR (42 (25.61%)), CFI (59 (35.98%)), RMSEA (62 (37.80%)), and NNFI (48 (29.27%)). Regarding the test of construct validity, convergent validity was examined in 109 studies (66.46%) and discriminant validity in 98 (59.76%). 81 studies (49.39%) reported the average variance extracted (AVE). However, there was little discussion of direct (47 (28.66%)), indirect, and total effects in the SEM models. Based on these findings, we suggest general guidelines for the use of SEM and propose recommendations concerning latent variable models, raw data, sample size, data screening, reporting of parameter estimates, model fit statistics, multivariate normality, confirmatory factor analysis, reliabilities, and the decomposition of effects.
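Two of the quantities the survey tracks, the normed $\chi^2$ and the average variance extracted (AVE), are simple to compute from SEM output. The sketch below uses hypothetical fit statistics and loadings; the cut-offs shown are common rules of thumb, not values from the article.

```python
# Normed chi-square (chi-square / degrees of freedom) and AVE from standardized loadings.

def normed_chi_square(chi_square: float, dof: int) -> float:
    return chi_square / dof

def average_variance_extracted(loadings: list[float]) -> float:
    """AVE = mean of squared standardized loadings for a construct."""
    return sum(l * l for l in loadings) / len(loadings)

chi2, dof = 412.8, 164                      # hypothetical fit statistics
loadings = [0.82, 0.78, 0.71, 0.69]         # hypothetical standardized loadings

print(f"normed chi-square = {normed_chi_square(chi2, dof):.2f} (<= 3 is a common cut-off)")
print(f"AVE = {average_variance_extracted(loadings):.2f} (>= 0.5 is a common cut-off)")
```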

Color Inverse Halftoning using Vector Adaptive Filter (벡터적응필터를 이용한 컬러 역하프토닝)

  • Kim, Chan-Su;Kim, Yong-Hun;Yi, Tai-Hong
    • Journal of KIISE:Software and Applications
    • /
    • v.35 no.3
    • /
    • pp.162-168
    • /
    • 2008
  • A look-up table based vector adaptive filter is proposed for color inverse halftoning. Inverse halftoning converts a halftone image into a continuous-tone image. Look-up table based methods require templates and training images, which can be obtained from the patterns distributed in sample halftone images and their original images. Although look-up table based methods are usually faster and show better PSNR performance than other methods, their quality varies widely depending on how they treat patterns that do not exist in the look-up table. In this paper, a vector adaptive filter is proposed to compensate for these nonexistent patterns; it achieves better quality owing to the contributed information about the hue, saturation, and intensity of surrounding pixels. The experimental results showed that the proposed method yielded a higher PSNR than the conventional Best Linear Estimation method. The larger the template in the look-up table, the better the quality obtained by the proposed method.
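A rough single-channel sketch of the look-up-table idea described above: the table maps each 3x3 binary halftone pattern seen during training to the mean original gray value, and patterns missing from the table fall back to a simple neighborhood average. The 3x3 pattern size, the synthetic training pair, and the plain-average fallback are simplifications; the paper's method instead uses a vector adaptive filter drawing on the hue, saturation, and intensity of surrounding pixels.

```python
import numpy as np

def patch_key(patch: np.ndarray) -> int:
    """Encode a 3x3 binary patch as a 9-bit integer LUT index."""
    return int("".join(str(int(v)) for v in patch.ravel()), 2)

def build_lut(halftone: np.ndarray, original: np.ndarray) -> dict:
    """Map each observed halftone pattern to the mean original gray value."""
    sums, counts = {}, {}
    for i in range(1, halftone.shape[0] - 1):
        for j in range(1, halftone.shape[1] - 1):
            k = patch_key(halftone[i-1:i+2, j-1:j+2])
            sums[k] = sums.get(k, 0.0) + float(original[i, j])
            counts[k] = counts.get(k, 0) + 1
    return {k: sums[k] / counts[k] for k in sums}

def inverse_halftone(halftone: np.ndarray, lut: dict) -> np.ndarray:
    out = np.zeros_like(halftone, dtype=float)
    for i in range(1, halftone.shape[0] - 1):
        for j in range(1, halftone.shape[1] - 1):
            patch = halftone[i-1:i+2, j-1:j+2]
            # Fallback for patterns never seen in training: local average of the patch.
            out[i, j] = lut.get(patch_key(patch), 255.0 * patch.mean())
    return out

# Train on one synthetic image pair, then reconstruct from the halftone.
rng = np.random.default_rng(0)
original = rng.integers(0, 256, (64, 64)).astype(float)
halftone = (original > rng.uniform(0, 255, original.shape)).astype(np.uint8)  # crude dither
reconstruction = inverse_halftone(halftone, build_lut(halftone, original))
```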

Parallel Computing on Intensity Offset Tracking Using Synthetic Aperture Radar for Retrieval of Glacier Velocity

  • Hong, Sang-Hoon
    • Korean Journal of Remote Sensing
    • /
    • v.35 no.1
    • /
    • pp.29-37
    • /
    • 2019
  • Synthetic Aperture Radar (SAR) observations are powerful tools for monitoring surface displacement very accurately, whether induced by earthquakes, volcanoes, ground subsidence, glacier movement, etc. In particular, radar interferometry (InSAR), which utilizes phase information related to the distance from sensor to target, can generate a displacement map in the line-of-sight direction with an accuracy of a few cm or mm. Due to the decorrelation effect, however, degradation of coherence in InSAR applications often prevents construction of a differential interferogram. The offset tracking method is an alternative approach that produces a two-dimensional displacement map using intensity information instead of phase. However, offset tracking requires very intensive computation power and time. In this paper, the efficiency of parallel computing for estimation of glacier velocity has been investigated using a high performance computer. Two TanDEM-X SAR observations, acquired on September 15, 2013 and September 26, 2013 over Narsap Sermia in southwestern Greenland, were collected. A total of 56 logical 2.4 GHz Intel Xeon processors (28 physical processors with hyperthreading) operating in a Linux environment were utilized. The Gamma software was used for the offset tracking, with the number of processors adjusted for OpenMP parallel computing. The processing times of the offset tracking with a 256 by 256 pixel window patch size on a single core and on 56 cores were 26,344 sec and 2,055 sec, respectively. It is impressive that the processing time could be reduced significantly, by a factor of about thirteen (12.81), with 56 cores. However, parallel computing using all the processors prevents other background operations or functions, so, apart from the offset tracking processing itself, the optimum number of processors needs to be evaluated for computing efficiency.
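The speedup implied by the timings quoted in the abstract can be checked directly; the short calculation below just reproduces that arithmetic and adds the corresponding parallel efficiency.

```python
# Timings reported above: 26,344 s on a single core vs. 2,055 s on 56 logical cores.
t_single, t_parallel, cores = 26_344.0, 2_055.0, 56

speedup = t_single / t_parallel      # ~12.8, close to the reported factor of 12.81
efficiency = speedup / cores         # fraction of ideal linear scaling on 56 cores
print(f"speedup = {speedup:.2f}x, parallel efficiency = {efficiency:.1%}")
```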

Analytical Consideration of Surface Dose and Kerma for Megavoltage Photon Beams in Clinical Radiation Therapy

  • Birgani, Mohammad Javad Tahmasebi;Behrooz, Mohammad Ali;Razmjoo, Sasan;Zabihzadeh, Mansour;Fatahiasl, Jafar;Maskni, Reza;Abdalvand, Neda;Asgarian, Zeynab;Shamsi, Azin
    • Asian Pacific Journal of Cancer Prevention
    • /
    • v.17 no.1
    • /
    • pp.153-157
    • /
    • 2016
  • Background: In radiation therapy, estimation of surface doses is clinically important. This study aimed to obtain an analytical relationship for determining the skin surface dose, kerma, and the depth of maximum dose at energies of 6 and 18 megavoltage (MV). Materials and Methods: To obtain the dose at the skin surface, using the relationship between dose and kerma and solving the differential equations governing the two quantities, a general relationship for the variation of dose with depth was obtained. By dosimetry of all the standard square fields from $5cm{\times}5cm$ to $40cm{\times}40cm$, an equation of the same form as the solution of the dose and kerma differential equations was fitted to the measurements for each field size and energy. Two conditions were applied: a) equality of the areas under the dose and kerma curves versus depth at 6 and 18 MV, and b) equality of kerma and dose at $x=d_{max}$. Using these results, the coefficients of the analytical relationship were determined. By setting the depth to zero in the relation, the PDD and kerma at the skin surface could be obtained. Results: Using the MATLAB software, an exponential binomial function with R-square >0.9953 was determined for each field size and depth in the two energy modes, 6 and 18 MV; the surface PDD and kerma were obtained, and both increase with increasing field size but decrease with increasing energy. From the obtained relation, the depth of maximum dose can also be determined. Conclusions: Using this analytical formula, one can find the skin surface dose, kerma, and the thickness of the buildup region.
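The abstract fits an "exponential binomial" depth-dose curve and then reads off the surface value at zero depth and the depth of maximum dose. A common two-exponential buildup form with that behaviour is $D(x)=Ae^{-\mu x}-Be^{-\nu x}$; the sketch below fits that form to synthetic data with SciPy and evaluates it at $x=0$. The functional form, the parameters, and the data are all assumptions for illustration, not the authors' fitted model.

```python
import numpy as np
from scipy.optimize import curve_fit

def depth_dose(x, A, B, mu, nu):
    """Two-exponential depth-dose model D(x) = A*exp(-mu*x) - B*exp(-nu*x)."""
    return A * np.exp(-mu * x) - B * np.exp(-nu * x)

# Synthetic "measured" PDD values for one field size (made-up parameters plus noise).
depth_cm = np.linspace(0.0, 20.0, 41)
rng = np.random.default_rng(2)
measured = depth_dose(depth_cm, 110.0, 70.0, 0.045, 1.1) + rng.normal(0, 0.5, depth_cm.size)

params, _ = curve_fit(depth_dose, depth_cm, measured, p0=(100.0, 60.0, 0.05, 1.0))
A, B, mu, nu = params
surface_value = depth_dose(0.0, *params)              # value at x = 0 (the skin surface)
d_max = np.log(B * nu / (A * mu)) / (nu - mu)          # depth where dD/dx = 0
print(f"surface value = {surface_value:.1f}, depth of maximum = {d_max:.2f} cm")
```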