• Title/Summary/Keyword: frequency-based method

Search Results: 6,110

Influence of the Education Service Quality and Result Expectations on Behavioral Intention: Focus on the TOEIC Business of a Global Company (교육서비스 품질과 교육성과의 기대일치여부가 행동의도에 미치는 영향: 글로벌기업의 TOEIC사업을 중심으로)

  • Kang, Ho-Gye;Song, In-Am;Hwang, Hee-Joong
    • Journal of Distribution Science / v.11 no.2 / pp.71-81 / 2013
  • Purpose - The TOEIC test has been leading the change in the quality and the globalization of companies for about the last 30 years. The TOEIC test is taken by about two million people each year and is used as a criterion to select new employees in companies or government offices, for performance ratings, and for overseas posting selections. Universities also use the TOEIC test in various ways. Since the TOEIC test is used for the selection of new students for admission, transferring extra credits, scholarships, graduation certification, and admission of international students studying abroad, many universities all over the country provide students with TOEIC lectures through their own language institutes. Despite this, there has been no research on the service quality or even the current situation of these institutes. Thus, this study evaluates the factors that affect TOEIC lecture service quality and analyzes the effect of education service quality and education result expectations on behavioral intention. Research design, data, methodology - Data were collected by administering a survey to current TOEIC students at different university language institutes. The survey questionnaire used a five-point Likert scale. The demographic analysis was conducted using the frequency analysis method, and factor analysis was conducted to verify the validity of the questionnaire for each variable. A reliability analysis was conducted to verify the reliability of the results. In addition, multiple regression analysis, regression analysis, and mediated-effect verification were conducted. For education service quality, four independent variables, reliability, response, conviction, and sympathy, were considered using the SERVQUAL survey model. Based on the research models, the study hypotheses below were formulated in order to identify the effect relationships between the variables. The four hypotheses are: "the hypothesis on education service quality and TOEIC study result expectation," "the hypothesis on education service quality and behavioral intention," "the hypothesis on study result expectation and behavioral intention," and "the hypothesis on study result expectation and mediated effect." Results - The results are as follows. First, the factors of response, conviction, and sympathy have a positive influence on TOEIC study result expectations. Second, the TOEIC study result expectation has a positive influence on the factors of behavioral intention, such as re-signing up, positive word-of-mouth, and loyalty towards the school. Third, it was verified that behavioral intention is influenced by education service quality at university foreign language institutes, while the study result expectation has only a partial mediating effect. Conclusions - The implications of this study are summarized as follows. First, it suggests a new research model for the effect of education service quality and education result expectations at university language institutes on behavioral intention. Second, it establishes a relationship between education service quality and study result expectation by verifying the mediating effect between them.
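Where the abstract mentions mediated-effect verification alongside regression, the following is a minimal sketch (not the authors' code) of a Baron-and-Kenny-style partial mediation check using statsmodels; the DataFrame column names (reliability, response, conviction, sympathy, expectation, intention) are hypothetical stand-ins for the survey constructs.

```python
# Hypothetical sketch of a mediation check; column names are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

def mediation_check(df: pd.DataFrame):
    # Step 1: service-quality factors -> behavioral intention (total effect)
    total = smf.ols("intention ~ reliability + response + conviction + sympathy", data=df).fit()
    # Step 2: service-quality factors -> study result expectation (mediator)
    to_mediator = smf.ols("expectation ~ reliability + response + conviction + sympathy", data=df).fit()
    # Step 3: factors plus mediator -> behavioral intention (direct effect)
    full = smf.ols("intention ~ reliability + response + conviction + sympathy + expectation", data=df).fit()
    # Partial mediation: factor coefficients shrink but remain significant once the
    # mediator is included; full mediation: they become non-significant.
    return total, to_mediator, full
```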


Generation of Grid Maps of GPS Signal Delays in the Troposphere and Analysis of Relative Point Positioning Accuracy Enhancement (GPS 신호의 대류권 지연정보 격자지도 생성과 상대측위 정확도 향상 평가)

  • Kim, Dusik;Won, Jihye;Son, Eun-Seong;Park, Kwan-Dong
    • Journal of Navigation and Port Research / v.36 no.10 / pp.825-832 / 2012
  • GPS signal delay caused by dry gases and water vapor in the troposphere is a main error source in GPS point positioning, and it must be eliminated for precise positioning. In this paper, we generated tropospheric delay grid maps over the Korean Peninsula with a post-processing method using the GPS permanent station network, in order to assess the feasibility of the tropospheric delay generation algorithm. GIPSY 5.0 was used for GPS data processing, and the nationwide AWS observation network was used to calculate the dry delay and the wet delay separately. In the grid map accuracy analysis, the RMSE between the grid map data and the GPS site data was 0.7 mm in zenith hydrostatic delay (ZHD), 7.6 mm in zenith wet delay (ZWD), and 8.5 mm in zenith total delay (ZTD). After the accuracy analysis, we applied the calculated tropospheric delay grid map to a single-frequency relative positioning algorithm and analyzed the resulting improvement in positioning accuracy. As a result, positioning accuracy was improved by up to 36% for relative positioning between Suwon (SUWN) and Mokpo (MKPO), where the baseline distance is about 297 km.
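As a companion to the grid-map application step, here is a minimal sketch, under an assumed grid layout and variable names, of evaluating a zenith total delay (ZTD = ZHD + ZWD) grid map at an arbitrary receiver position by bilinear interpolation; the paper's actual grid generation with GIPSY and AWS data is not reproduced.

```python
# Illustrative bilinear interpolation of a tropospheric delay grid map (assumed layout).
import numpy as np

def interpolate_ztd(grid, lats, lons, lat, lon):
    """grid[i, j] holds ZTD (m) at (lats[i], lons[j]); lats and lons are ascending."""
    i = int(np.clip(np.searchsorted(lats, lat) - 1, 0, len(lats) - 2))
    j = int(np.clip(np.searchsorted(lons, lon) - 1, 0, len(lons) - 2))
    t = (lat - lats[i]) / (lats[i + 1] - lats[i])   # fractional position inside the cell
    u = (lon - lons[j]) / (lons[j + 1] - lons[j])
    return ((1 - t) * (1 - u) * grid[i, j] + t * (1 - u) * grid[i + 1, j]
            + (1 - t) * u * grid[i, j + 1] + t * u * grid[i + 1, j + 1])
```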

A Study on the Criteria for Collision Avoidance of Naval Ships for Obstacles in Constant Bearing, Decreasing Range (CBDR) (방위끌림이 없는 장애물에 대한 함정의 충돌회피 기준에 관한 연구)

  • Ha, Jeong-soo;Jeong, Yeon-hwan
    • Journal of Navigation and Port Research / v.43 no.6 / pp.377-383 / 2019
  • Naval ships under way always face the possibility of collision, but there is no clear maneuvering procedure for collision avoidance, and there is a tendency to depend entirely on the intuitive judgment of the Officer of the Watch (OOW). In this study, we conducted a questionnaire survey of OOWs on when and how they avoid collision in a Constant Bearing, Decreasing Range (CBDR) situation in which a naval ship encounters obstacles. Using the survey results, we analyzed CBDR situations involving obstacles and how collisions are avoided by day and by night. The most difficult areas to maneuver were Pyeongtaek and Mokpo, and difficult situations occurred mainly in narrow channels. Such situations arose on average about once every four hours, and encounters with many ships were more common than 1:1 situations. When checking for a collision course, visual confirmation was trusted more, and priority was given to the distance at closest point of approach (DCPA) and the time at closest point of approach (TCPA). There was no difference in DCPA between the give-way ship and the stand-on ship, but there was a difference between day and night. Also, most navigators prefer to combine course and speed changes when avoiding collisions, with course changes of 10-15°, speed changes of ±5 knots, and an avoidance course set toward the stern of the obstacle. These results can provide officers with standards for collision avoidance and can also be applied to the development of AI- and big-data-based collision avoidance algorithms for unmanned ships.
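Since the surveyed officers prioritize DCPA and TCPA, the following is a minimal sketch of how these two quantities are commonly computed for straight, constant-speed tracks; the local East-North coordinates and function names are assumptions for illustration.

```python
# Illustrative DCPA/TCPA computation for own ship and a target on constant courses.
import numpy as np

def cpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Positions in metres, velocities in m/s; returns (DCPA in m, TCPA in s)."""
    rel_pos = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)
    rel_vel = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)
    speed2 = rel_vel @ rel_vel
    if speed2 == 0.0:                        # identical course and speed: range is constant
        return float(np.linalg.norm(rel_pos)), float("inf")
    tcpa = -(rel_pos @ rel_vel) / speed2     # negative TCPA means the ships are diverging
    dcpa = float(np.linalg.norm(rel_pos + rel_vel * max(tcpa, 0.0)))
    return dcpa, float(tcpa)
```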

Effective Picture Search in Lifelog Management Systems using Bluetooth Devices (라이프로그 관리 시스템에서 블루투스 장치를 이용한 효과적인 사진 검색 방법)

  • Chung, Eun-Ho;Lee, Ki-Yong;Kim, Myoung-Ho
    • Journal of KIISE:Computing Practices and Letters / v.16 no.4 / pp.383-391 / 2010
  • A lifelog management system provides users with services to store, manage, and search their lifelogs. This paper proposes a fully automatic method for collecting real-world social contacts and a lifelog search engine that uses the collected social contact information as keywords. Short-range wireless network devices in mobile phones are used to detect the social contacts of their users. A human-Bluetooth relationship matrix is built based on the frequency with which a person and a Bluetooth device are observed at the same time. Results show that when only 20% of the full social contact information from the observation period is used for the calculation, 90% of the human-Bluetooth relationships can be correctly acquired. A lifelog search engine that takes human names as keywords is proposed; it compares two vectors, a row of the human-Bluetooth matrix and the vector of Bluetooth devices scanned while a lifelog was created, using the vector-space information retrieval model. This search engine returns more lifelogs than an existing text-matching search engine and, unlike the existing engine, ranks the results.
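A minimal sketch of the described vector-model search, under assumed data structures: the co-occurrence matrix row for the queried person is compared by cosine similarity with the Bluetooth scan vector stored with each lifelog (identifiers and container types are illustrative, not the authors' implementation).

```python
# Illustrative human-Bluetooth matrix construction and cosine-similarity ranking.
import numpy as np

def build_relationship_matrix(observations, people, devices):
    """observations: iterable of (person, device) pairs seen at the same time."""
    matrix = np.zeros((len(people), len(devices)))
    p_idx = {p: i for i, p in enumerate(people)}
    d_idx = {d: j for j, d in enumerate(devices)}
    for person, device in observations:
        matrix[p_idx[person], d_idx[device]] += 1.0   # co-occurrence frequency
    return matrix, p_idx

def search_lifelogs(person, lifelog_scans, matrix, p_idx):
    """lifelog_scans: {lifelog_id: device vector scanned when the lifelog was created}."""
    query = matrix[p_idx[person]]
    def cosine(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b) / denom if denom else 0.0
    return sorted(lifelog_scans.items(),
                  key=lambda kv: cosine(query, np.asarray(kv[1], float)),
                  reverse=True)               # lifelogs ranked by similarity
```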

Study on Enhancement of TRANSGUIDE Outlier Filter Method under Unstable Traffic Flow for Reliable Travel Time Estimation -Focus on Dedicated Short Range Communications Probes- (불안정한 교통류상태에서 TRANSGUIDE 이상치 제거 기법 개선을 통한 교통 통행시간 예측 향상 연구 -DSRC 수집정보를 중심으로-)

  • Khedher, Moataz Bellah Ben;Yun, Duk Geun
    • Journal of the Korea Academia-Industrial cooperation Society / v.18 no.3 / pp.249-257 / 2017
  • Filtering the travel time records obtained from DSRC probes is essential for a better estimation of link travel time. This study addresses the major deficiency of TRANSGUIDE in removing anomalous data: the algorithm is unable to handle unstable traffic flow conditions in time intervals where fluctuations are observed. In this regard, this study proposes an algorithm that is capable of overcoming the weaknesses of TRANSGUIDE. If TRANSGUIDE fails to validate a sufficient number of observations within one time interval, another process specifies a new validity range based on the median absolute deviation (MAD), a common statistical approach. The proposed algorithm introduces two parameters, α and β, to control the maximum allowed outliers within one time interval in response to particular traffic flow conditions. Parameter estimation relies on historical data because the parameters need to be updated frequently. To test the proposed algorithm, DSRC probe travel time data were collected from a multilane highway section. Calibration of the model was performed by statistical analysis using cumulative relative frequency. The qualitative evaluation shows satisfactory performance: the proposed model overcomes the deficiency associated with rapid changes in travel time.
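The MAD-based fallback described above can be illustrated with the following minimal sketch; the 1.4826 scale factor and the multiplier k are standard choices for MAD filtering, and the paper's α/β parameterization is not reproduced here.

```python
# Illustrative MAD-based validity range for travel times in one aggregation interval.
import numpy as np

def mad_validity_range(travel_times, k=3.0):
    """Return (lower, upper) bounds for valid travel times."""
    t = np.asarray(travel_times, dtype=float)
    med = np.median(t)
    mad = 1.4826 * np.median(np.abs(t - med))   # robust estimate of spread
    return med - k * mad, med + k * mad

def filter_interval(travel_times, k=3.0):
    lo, hi = mad_validity_range(travel_times, k)
    return [x for x in travel_times if lo <= x <= hi]
```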

A Study on the quantitative measurement methods of MRTD and prediction of detection distance for Infrared surveillance equipments in military (군용 열영상장비 최소분해가능온도차의 정량적 측정 방법 및 탐지거리 예측에 관한 연구)

  • Jung, Yeong-Tak;Lim, Jae-Seong;Lee, Ji-Hyeok
    • Journal of the Korea Academia-Industrial cooperation Society / v.18 no.5 / pp.557-564 / 2017
  • The purpose of the thermal imaging observation device mounted on K-series tanks of the Republic of Korea military is to convert infrared radiation into visual information about the environment under conditions of restricted visibility. Among the various performance indicators of thermal observation devices, such as field of view, magnification, resolution, MTF, NETD, and the Minimum Resolvable Temperature Difference (MRTD), the MRTD is the most important, because it indicates both the resolvable spatial frequency and the resolvable temperature difference. However, the NATO standard method of measuring the MRTD involves many subjective factors. Because the measurement result can vary with subjective factors such as the observer's eyesight, mental condition, and measurement conditions, the MRTD obtained is not stable. In this study, this qualitative MRTD measurement procedure is converted into a quantitative indicator based on gray scale using image processing. By converting the average of the gray-scale differences between the black and white images into an MRTD, the mean value can be used to determine whether the performance requirements of the defense specification are met. The mean value can also be used to discriminate between detection, recognition, and identification, and the detection distance of the thermal equipment can be analyzed under various environmental conditions, such as altostratus clouds, heavy rain, and fog.
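The quantitative step, measuring the mean gray-level difference between the white and black regions of a captured target image, might look like the following sketch; splitting the pixels at the mean gray level is an assumption made for illustration, not necessarily the paper's exact procedure.

```python
# Illustrative mean gray-level difference between bright and dark regions of a target image.
import numpy as np

def mean_gray_difference(image):
    """image: 2-D array of gray levels covering the target region."""
    img = np.asarray(image, dtype=float)
    threshold = img.mean()                  # simple split between bright and dark pixels
    white = img[img >= threshold]
    black = img[img < threshold]
    return float(white.mean() - black.mean())
```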

Application of Effective Regularization to Gradient-based Seismic Full Waveform Inversion using Selective Smoothing Coefficients (선택적 평활화 계수를 이용한 그래디언트기반 탄성파 완전파형역산의 효과적인 정규화 기법 적용)

  • Park, Yunhui;Pyun, Sukjoon
    • Geophysics and Geophysical Exploration / v.16 no.4 / pp.211-216 / 2013
  • In general, smoothing filters regularize functions by reducing differences between adjacent values. Smoothing filters can therefore regularize inverse solutions and produce more accurate subsurface structures when applied to full waveform inversion. If we apply a smoothing filter with a constant coefficient to a subsurface image or velocity model, however, it blurs layer interfaces and fault structures because it does not take geologic structure or velocity variations into account. In this study, we develop a selective smoothing regularization technique, which adapts the smoothing coefficients according to the inversion iteration, to overcome this weakness of constant-coefficient smoothing regularization. First, we determine appropriate frequencies and analyze the corresponding wavenumber coverage. Then, we define the effective maximum wavenumber as the 99th percentile of the wavenumber spectrum in order to choose smoothing coefficients that effectively limit the wavenumber coverage. By adapting the chosen smoothing coefficients according to the iteration, we can implement multi-scale full waveform inversion while inverting multi-frequency components simultaneously. Through a successful inversion example on a salt model with high-contrast velocity structures, we show that our method effectively regularizes the inverse solution. We also verify that our scheme is applicable to field data through a numerical example on synthetic data containing random noise.
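A minimal sketch of choosing a smoothing coefficient from the gradient's wavenumber content, as outlined above; Gaussian smoothing and the particular mapping from the effective maximum wavenumber to the filter width are illustrative assumptions rather than the authors' scheme.

```python
# Illustrative selection of a smoothing width from the 99th-percentile wavenumber.
import numpy as np
from scipy.ndimage import gaussian_filter

def effective_max_wavenumber(trace, dx, pct=99.0):
    """99th-percentile wavenumber of one gradient trace's amplitude spectrum."""
    spec = np.abs(np.fft.rfft(trace))
    k = np.fft.rfftfreq(trace.size, d=dx)        # wavenumber axis (cycles per unit length)
    cum = np.cumsum(spec) / np.sum(spec)
    return float(k[np.searchsorted(cum, pct / 100.0)])

def smooth_gradient(grad, dx, pct=99.0):
    """grad: 2-D gradient array; smooth it so wavenumbers above the 99th percentile are suppressed."""
    k_eff = max(np.mean([effective_max_wavenumber(tr, dx, pct) for tr in grad]), 1e-6)
    sigma = 1.0 / (2.0 * np.pi * k_eff * dx)     # filter width in grid samples
    return gaussian_filter(grad, sigma=sigma)
```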

Mega Flood Simulation Assuming Successive Extreme Rainfall Events (연속적인 극한호우사상의 발생을 가정한 거대홍수모의)

  • Choi, Changhyun;Han, Daegun;Kim, Jungwook;Jung, Jaewon;Kim, Duckhwan;Kim, Hung Soo
    • Journal of Wetlands Research / v.18 no.1 / pp.76-83 / 2016
  • Recently, series of extreme storm events have occurred due to consecutive typhoons, causing severe flood damage including loss of life and destruction of property. In this study, we call the extreme flood produced by such successive storm events a Mega flood, and we construct a hypothetical Mega flood by assuming that extreme events occur in succession with a certain time interval. The Inter-Event Time Definition (IETD) method was used to determine the time interval between continuous events in order to simulate a Mega flood. Continuous extreme rainfall events were therefore identified with the IETD, and Mega floods were simulated from consecutive events in two ways: (1) consecutive occurrence of two historical extreme events, and (2) consecutive occurrence of two design events obtained by frequency analysis of the historical data. We showed that Mega floods from continuous extreme rainfall events were 6-17% larger than a typical flood from a single event. We can therefore expect the damage caused by a Mega flood to be much greater than the damage driven by a single rainfall event. The flood from the second heavy rainfall is not much larger than the flood from the first, but continuous heavy rain brings flood damage twice, so the damage caused by the hypothetical Mega flood is judged to be very large. The hypothetical rainfall events used here, which can produce Mega floods, could be used to prepare for unexpected flood disasters by simulating the Mega floods defined in this study.
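The IETD-based separation of a rainfall record into independent events can be sketched as follows; hourly data and the ietd_hours threshold are assumptions for illustration, and the subsequent rainfall-runoff simulation is not reproduced.

```python
# Illustrative event separation: a dry spell of at least `ietd_hours` ends one event.
from typing import List, Tuple

def split_events(rain_mm: List[float], ietd_hours: int) -> List[Tuple[int, int]]:
    """Return (start, end) hour indices (inclusive) of each rainfall event."""
    events, start, end, dry = [], None, None, 0
    for i, r in enumerate(rain_mm):
        if r > 0.0:
            if start is None:
                start = i                  # a new event begins
            end, dry = i, 0
        elif start is not None:
            dry += 1
            if dry >= ietd_hours:          # dry spell long enough: close the event
                events.append((start, end))
                start = None
    if start is not None:
        events.append((start, end))
    return events
```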

Determination of drought events considering the possibility of relieving drought and estimation of design drought severity (가뭄해갈 가능성을 고려한 가뭄사상의 결정 및 확률 가뭄심도 산정)

  • Yoo, Ji Young;Yu, Ji Soo;Kwon, Hyun-Han;Kim, Tae-Woong
    • Journal of Korea Water Resources Association / v.49 no.4 / pp.275-282 / 2016
  • The objective of this study is to propose a new method to determine drought events and the design drought severity. In order to define a drought event from precipitation data, the theory of runs was applied to the cumulative rainfall deficit. In this study, when a large amount of rainfall exceeds the threshold level, we compare it with the previous cumulative rainfall deficit to determine whether the drought is relieved or not. The recurrence characteristics of drought severity for a specific duration were analyzed with a conditional bivariate copula function, and confidence intervals were estimated to quantify the uncertainties. The methodology was applied to the Seoul station with the historical dataset (1909~2015). It was observed that past droughts regarded as extreme hydrological events had return periods of 10 to 50 years. On the other hand, the ongoing drought event that started in 2013 showed a significantly higher return period. It is expected that the results of this study can be used as reliable, return-period-based criteria for drought contingency planning.
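A minimal sketch of run-theory drought identification with the relief check described above, assuming monthly precipitation and a fixed threshold (both illustrative): months below the threshold accumulate a deficit, and a wet month ends the drought only if its surplus offsets the accumulated deficit.

```python
# Illustrative run-theory drought events with a relief condition.
from typing import List, Tuple

def drought_events(precip: List[float], threshold: float) -> List[Tuple[int, int, float]]:
    """Return (start, end, severity) per drought, severity being the cumulative deficit."""
    events, start, deficit = [], None, 0.0
    for i, p in enumerate(precip):
        if p < threshold:                   # deficit month: extend the run
            if start is None:
                start = i
            deficit += threshold - p
        elif start is not None:
            surplus = p - threshold
            if surplus >= deficit:          # enough rainfall: the drought is relieved
                events.append((start, i - 1, deficit))
                start, deficit = None, 0.0
            else:                           # not relieved: carry the remaining deficit
                deficit -= surplus
    if start is not None:
        events.append((start, len(precip) - 1, deficit))
    return events
```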

Application of Terahertz Spectroscopy and Imaging in the Diagnosis of Prostate Cancer

  • Zhang, Ping;Zhong, Shuncong;Zhang, Junxi;Ding, Jian;Liu, Zhenxiang;Huang, Yi;Zhou, Ning;Nsengiyumva, Walter;Zhang, Tianfu
    • Current Optics and Photonics / v.4 no.1 / pp.31-43 / 2020
  • The feasibility of applying terahertz electromagnetic waves to the diagnosis of prostate cancer was examined. Four paraffin-embedded prostate tissue samples containing incomplete cancerous regions were examined using a terahertz spectral imaging (TPI) system, and the results obtained by comparing the absorption coefficient and refractive index of prostate tumor, normal prostate tissue, and smooth muscle from one of the paraffin tissue masses are reported. Three hundred and sixty absorption-coefficient measurements from one of the paraffin tissues were used as raw data to classify these three tissue types using Principal Component Analysis (PCA) and a Least Squares Support Vector Machine (LS-SVM). An excellent classification, with an accuracy of 92.22% on the prediction set, was achieved. Using the distribution of the THz reflection signal intensity from the sample surface and the absorption coefficient of the sample, an attempt was made to use the TPI system to identify the boundaries of the different tissues involved (prostate tumor, normal tissue, and smooth muscle). The locations of the three identified regions in the terahertz images (frequency-domain slice absorption coefficient imaging at 1.2 THz) were compared with those obtained from the histopathologic examination. The tumor region had a distinctively visible color and could be clearly distinguished from the other tissue regions in the terahertz images. The results indicate that a THz spectroscopy imaging system can be used efficiently, in conjunction with the proposed computer-based mathematical analysis method, to identify tumor regions in paraffin-embedded prostate cancer tissue.
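The PCA-plus-SVM classification step can be sketched with scikit-learn as follows; note that scikit-learn's SVC is a standard (not least-squares) SVM used here as a stand-in for the LS-SVM, and the inputs X (absorption-coefficient spectra) and y (tissue labels), as well as the number of retained components, are assumptions.

```python
# Illustrative PCA + SVM pipeline for classifying tissue spectra.
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def classify_tissues(X, y):
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)
    model = make_pipeline(
        StandardScaler(),
        PCA(n_components=10),               # assumed number of principal components
        SVC(kernel="rbf", C=1.0))
    model.fit(X_train, y_train)
    return model, model.score(X_test, y_test)   # accuracy on the prediction set
```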