• Title/Abstract/Keyword: Baseline Correction


Effectiveness of Overnight Orthokeratology with a New Contact Lens Design in Moderate to High Myopia with Astigmatism

  • Park, Yuli;Kim, Hoon;Kang, Jae Ku;Cho, Kyong Jin
    • Medical Lasers
    • /
    • v.10 no.4
    • /
    • pp.229-237
    • /
    • 2021
  • Background and Objectives To assess the effectiveness of overnight orthokeratology (OK) in myopia using a new contact lens design over a one-month wearing period. Materials and Methods Participants were required to have myopia between -3.00 and -7.50 D and astigmatism ≤ 2.00 D to participate in the study. The participants underwent OK with the White OK lens® (Interojo, Pyungtek, Korea), which has a 6-curve lens design. Participants were assessed at weeks 1, 2, and 4 using slit-lamp biomicroscopy and tested for refraction, uncorrected distance visual acuity, and corneal topography. Success was defined as achieving a Logarithm of the Minimum Angle of Resolution (logMAR) ≤ 0.1. Results A total of 46 eligible subjects with a mean age of 23.11 ± 7.89 years were recruited. Baseline logMAR was 1.18 ± 0.30, and a consistent decrease in logMAR was observed from week 1 to week 4. The success rate was 95.35% at week 4. The mean sphere decreased significantly from a pre-fitting value of -4.58 ± 1.28 D to -0.65 ± 0.69 D at week 4 (p < 0.0001). Statistically significant corneal flattening was detected on keratometry at week 4. Conclusion Overnight OK with the White OK lens is effective for the correction of moderate and high myopia with astigmatism over a one-month wearing period.

Masked language modeling-based Korean Data Augmentation Techniques Using Label Correction (정답 레이블을 고려한 마스킹 언어모델 기반 한국어 데이터 증강 방법론)

  • Myunghoon Kang;Jungseob Lee;Seungjun Lee;Hyeonseok Moon;Chanjun Park;Yuna Hur;Heuiseok Lim
    • Annual Conference on Human and Language Technology
    • /
    • 2022.10a
    • /
    • pp.485-490
    • /
    • 2022
  • Data augmentation techniques increase the size and diversity of an original dataset without additional data construction or collection. Data augmentation has evolved from rule-based to model-based methods, and model-based augmentation using Masked Language Modeling (MLM) has recently been studied actively. However, existing MLM-based augmentation methods use random token replacement, so they do not account for key tokens whose substitution is likely to change the meaning of a sentence, and they provide no way to correct the label after augmentation. To mitigate these problems, this paper proposes an MLM-based Korean data augmentation method with an added Re-labeling module that takes the label into account. Validated on the KLUE-STS and KLUE-NLI evaluation sets, the proposed method improved baseline performance by 1.22% while using about 89% less data than the existing MLM approach. An ablation experiment on the Gate Function also confirmed the structural validity of the proposed Re-labeling module.

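The masked-token replacement and re-labeling idea summarized above can be sketched in a few lines; the checkpoint name, masking ratio, and helper functions below are illustrative assumptions, not the authors' implementation (which additionally re-labels each augmented sentence with a dedicated module).

    # A minimal sketch of MLM-based replacement augmentation (Python).
    # The model checkpoint, masking ratio, and re-labeling note are assumptions.
    import random
    from transformers import pipeline

    fill_mask = pipeline("fill-mask", model="klue/bert-base")  # assumed Korean MLM checkpoint

    def augment(sentence: str, mask_ratio: float = 0.15) -> str:
        """Mask a random subset of whitespace tokens and refill each with the MLM's top guess."""
        tokens = sentence.split()
        n_mask = max(1, int(len(tokens) * mask_ratio))
        for idx in random.sample(range(len(tokens)), n_mask):
            original = tokens[idx]
            tokens[idx] = fill_mask.tokenizer.mask_token            # e.g. "[MASK]"
            best = fill_mask(" ".join(tokens), top_k=1)[0]          # highest-scoring fill
            tokens[idx] = best["token_str"].strip() or original
        return " ".join(tokens)

    # The paper's Re-labeling module would then re-score the augmented sentence
    # (e.g. with a trained STS/NLI model) instead of reusing the original label.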

Coastline Changes in the Tumen River Estuary over the Past 35 Years Using the Landsat Satellite Imagery (LANDSAT 위성영상을 이용한 과거 35년간 두만강 하구 해안선 변화 연구)

  • Zhao, Yuwei;Zhao, Shuqing;Xu, Zhen;Lee, Dongkun
    • Journal of the Korean Society of Environmental Restoration Technology
    • /
    • v.27 no.3
    • /
    • pp.57-66
    • /
    • 2024
  • Coastlines evolve with varying intensity under the continuous influence of natural and anthropogenic forces. In this paper, we extracted the coastline of the Tumen River estuary from 1985 to 2020 using a digitizing method, verified its accuracy against visual interpretation results, and analyzed coastline changes in the estuary with the area method and the baseline method. The results showed an erosion trend over the 35 years, with an average erosion rate of 0.05 m/year, an average erosion distance of 3.06 m, and an eroded area of 19.25 ha. Human activities retarded the erosion of the coastline, but these activities also had a long-term impact on its natural morphology.
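
The baseline method cited above measures shoreline movement along transects cast from a fixed reference baseline; the end-point rate of change is the displacement divided by the elapsed time. A minimal sketch of that calculation follows, with the transect distances chosen as illustrative placeholders rather than data from the study.

    # Minimal sketch of the baseline (transect) method for shoreline change rates.
    # Distances are measured from the baseline to the shoreline along each transect;
    # the numbers here are placeholders, not measurements from the study.

    def end_point_rates(dist_t0, dist_t1, years):
        """Per-transect end-point rate (m/year); negative values indicate erosion."""
        return [(d1 - d0) / years for d0, d1 in zip(dist_t0, dist_t1)]

    # Baseline-to-shoreline distances (m) on three transects in 1985 and 2020
    rates = end_point_rates([120.0, 98.5, 143.2], [118.4, 95.1, 141.9], years=35)
    print(rates, sum(rates) / len(rates))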

Assessment of Climate Change Impact on Storage Behavior of Chungju and the Regulation Dams Using SWAT Model (SWAT을 이용한 기후변화가 충주댐 및 조정지댐 저수량에 미치는 영향 평가)

  • Jeong, Hyeon Gyo;Kim, Seong-Joon;Ha, Rim
    • Journal of Korea Water Resources Association
    • /
    • v.46 no.12
    • /
    • pp.1235-1247
    • /
    • 2013
  • This study evaluates the impact of climate change on the future storage behavior of Chungju Dam (2,750 × 10⁶ m³) and its regulation dam (30 × 10⁶ m³) using the SWAT (Soil and Water Assessment Tool) model. Using nine years of data (2002~2010), SWAT was calibrated and validated for streamflow at three locations with an average Nash-Sutcliffe Efficiency (NSE) of 0.73 and for two reservoir water levels with an NSE of 0.86. For the future evaluation, HadCM3 GCM (General Circulation Model) data under the IPCC (Intergovernmental Panel on Climate Change) SRES (Special Report on Emission Scenarios) A2 and B1 scenarios were adopted. The monthly temperature and precipitation data (2007~2099) were spatially corrected against 30 years (1977~2006, baseline period) of ground-measured data through bias correction and temporally downscaled with the Change Factor (CF) statistical method. For the two periods, the 2040s (2031~2050) and 2080s (2071~2099), the future annual temperature was projected to change by +0.9°C in the 2040s and +4.0°C in the 2080s, and annual precipitation to increase by 9.6% and 20.7%, respectively. Future watershed evapotranspiration increased by up to 15.3% and soil moisture decreased by a maximum of 2.8% compared to the baseline (2002~2010) condition. Under a future dam release condition set to the nine-year (2002~2010) average for each dam, yearly dam inflow increased by a maximum of 21.1% for most periods except autumn. Because of the decrease in future autumn inflow, the dam storage could not recover to the full water level at the end of the year under the present release pattern. For future flood and drought years, the temporal variation of dam storage became more unstable, requiring careful downward and upward management of storage, respectively. Thus the dam release pattern needs to be adjusted for climate change adaptation.
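
The Change Factor downscaling mentioned above scales the observed baseline series by the ratio (for precipitation) or difference (for temperature) between the GCM's future and baseline monthly climatologies. A minimal sketch follows; the series and factors are illustrative placeholders, not values from the study.

    # Minimal sketch of Change Factor (CF) downscaling applied month by month.
    # Multiplicative factors suit precipitation; additive deltas suit temperature.

    def change_factor_series(obs_monthly, gcm_baseline_mean, gcm_future_mean, multiplicative=True):
        """Build a future monthly series from observed data and 12 per-month GCM change factors."""
        future = []
        for i, obs in enumerate(obs_monthly):
            m = i % 12
            if multiplicative:                      # precipitation-style scaling
                future.append(obs * gcm_future_mean[m] / gcm_baseline_mean[m])
            else:                                   # temperature-style shift
                future.append(obs + gcm_future_mean[m] - gcm_baseline_mean[m])
        return future

    obs = [50.0 + 10 * (m % 12) for m in range(24)]   # two years of observed precipitation (mm)
    base = [60.0 + 10 * m for m in range(12)]         # GCM baseline monthly means
    fut = [v * 1.1 for v in base]                     # GCM future means, +10% every month
    print(change_factor_series(obs, base, fut)[:3])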

Bias Correction for GCM Long-term Prediction using Nonstationary Quantile Mapping (비정상성 분위사상법을 이용한 GCM 장기예측 편차보정)

  • Moon, Soojin;Kim, Jungjoong;Kang, Boosik
    • Journal of Korea Water Resources Association
    • /
    • v.46 no.8
    • /
    • pp.833-842
    • /
    • 2013
  • Quantile mapping is used to produce reliable GCM (Global Climate Model) data by correcting systematic biases in the original data set. In general, this scheme maps the Cumulative Distribution Function (CDF) of the underlying data set onto the target CDF under the assumption that the parameters of the target distribution are stationary. Therefore, applying stationary quantile mapping to the nonstationary long-term precipitation scenarios computed by a GCM can yield biased projections. In this research, a Nonstationary Quantile Mapping (NSQM) scheme is suggested for the bias correction of nonstationary long-term time series; the proposed scheme uses statistical parameters with nonstationary long-term trends. The Gamma distribution was assumed for both the original and target probability distributions. As the climate change scenarios, the 20C3M (baseline) and SRES A2 (projection) runs of the CGCM3.1/T63 model from the CCCma (Canadian Centre for Climate Modelling and Analysis) were used. Precipitation data were collected from 10 rain gauge stations in the Han River basin. To consider seasonal characteristics, the study was performed separately for the flood (June~October) and non-flood (November~May) seasons, and the baseline and projection periods were set to 1973~2000 and 2011~2100, respectively. This study evaluated the performance of NSQM by experimenting with various ways of setting the target distribution parameters. The projections were presented for three periods: the FF (Foreseeable Future, 2011~2040), MF (Mid-term Future, 2041~2070), and LF (Long-term Future, 2071~2100) scenarios. The trend test for the annual precipitation projection using NSQM shows increases of 330.1 mm (25.2%), 564.5 mm (43.1%), and 634.3 mm (48.5%) for the FF, MF, and LF scenarios, respectively. The stationary scheme overestimates the projection for the FF scenario and underestimates it for the LF scenario; this problem could be alleviated by applying nonstationary quantile mapping.
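
Quantile mapping replaces each raw model value with the observed value at the same cumulative probability; a minimal stationary, gamma-based sketch follows. The nonstationary variant proposed in the paper would additionally let the target gamma parameters follow fitted long-term trends; the fitting choices below are generic assumptions, not the authors' settings.

    # Minimal sketch of gamma-based quantile mapping: x_corrected = F_obs^-1(F_model(x)).
    # Stationary form only; the paper's NSQM lets the target parameters vary in time.
    from scipy import stats

    def quantile_map(raw, model_baseline, obs_baseline):
        """Map raw model values through the model-baseline CDF and the inverse observed CDF."""
        a_m, _, scale_m = stats.gamma.fit(model_baseline, floc=0)   # loc fixed at 0 for precipitation
        a_o, _, scale_o = stats.gamma.fit(obs_baseline, floc=0)
        probs = stats.gamma.cdf(raw, a_m, scale=scale_m)
        return stats.gamma.ppf(probs, a_o, scale=scale_o)

    obs = [80, 120, 95, 200, 150, 60, 300, 110, 90, 130]   # observed monthly precipitation (mm)
    mod = [50, 90, 70, 150, 100, 40, 220, 80, 60, 95]      # model is systematically drier
    print(quantile_map([75, 160], mod, obs))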

Analysis on Line-Of-Sight (LOS) Vector Projection Errors according to the Baseline Distance of GPS Orbit Errors (GPS 궤도오차의 기저선 거리에 따른 시선각 벡터 투영오차 분석)

  • Jang, JinHyeok;Ahn, JongSun;Bu, Sung-Chun;Lee, Chul-Soo;Sung, SangKyung;Lee, Young Jae
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.45 no.4
    • /
    • pp.310-317
    • /
    • 2017
  • Recently, many nations have been operating and developing Global Navigation Satellite Systems (GNSS). Satellite Based Augmentation Systems (SBAS), which use geostationary satellites, are also operated to improve GNSS performance; the most widely used SBAS is the Wide Area Augmentation System (WAAS) for GPS, developed by the United States. SBAS uses various algorithms to offer its users guaranteed accuracy, availability, continuity, and integrity. One of these algorithms guarantees satellite integrity: it calculates the satellite errors, generates corrections, and provides them to the users. At this step the satellite orbit errors are calculated in three-dimensional space, and the reference-station placement is crucial for this calculation: the wider the reference stations are spread, the wider the LOS vectors spread, and the better the accuracy becomes. The regional characteristics of the US and Korea therefore need to be compared. Korea is geographically very narrow compared to the US, so there may be a problem if the three-dimensional method of calculating satellite orbit errors is used without modification. This paper suggests a method that uses scalar values instead of a three-dimensional calculation of satellite orbit errors and shows its feasibility for a narrow area. The suggested method uses the scalar value obtained by projecting the orbit error onto the LOS vector between a reference station and a satellite. The change in this error with baseline distance is examined for Korea and the United States, and the difference in the error change is compared to demonstrate the feasibility of the proposed method.
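
The scalar quantity proposed above is simply the three-dimensional orbit-error vector projected onto the unit line-of-sight vector from a reference station to the satellite. A small sketch of that projection follows; the coordinates are arbitrary illustrative numbers, not real ephemeris or station data.

    # Minimal sketch: project a 3D satellite orbit error onto the station-to-satellite
    # line-of-sight (LOS) unit vector. Coordinates are ECEF-like placeholders.
    import math

    def los_projection(station, satellite, orbit_error):
        """Scalar orbit error along the LOS, in the same units as the error vector."""
        los = [s - r for s, r in zip(satellite, station)]
        norm = math.sqrt(sum(c * c for c in los))
        unit = [c / norm for c in los]
        return sum(e * u for e, u in zip(orbit_error, unit))   # dot product

    station = [-3_120_000.0, 4_080_000.0, 3_760_000.0]         # reference station (m)
    satellite = [-14_500_000.0, 19_300_000.0, 12_200_000.0]    # GPS satellite (m)
    error = [2.0, -1.5, 0.8]                                   # orbit error vector (m)
    print(los_projection(station, satellite, error))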

PM2.5 Simulations for the Seoul Metropolitan Area: (III) Application of the Modeled and Observed PM2.5 Ratio on the Contribution Estimation (수도권 초미세먼지 농도모사: (III) 관측농도 대비 모사농도 비율 적용에 따른 기여도 변화 검토)

  • Bae, Changhan;Yoo, Chul;Kim, Byeong-Uk;Kim, Hyun Cheol;Kim, Soontae
    • Journal of Korean Society for Atmospheric Environment
    • /
    • v.33 no.5
    • /
    • pp.445-457
    • /
    • 2017
  • In this study, we developed an approach to better account for uncertainties in contribution estimates from fine particulate matter (PM2.5) modeling. Our approach computes a Concentration Correction Factor (CCF), the ratio of observed concentrations to baseline model concentrations, and multiplies modeled direct contribution estimates by the CCF to obtain revised contributions. Overall, the modeling system showed reasonably good performance, with a correlation coefficient R of 0.82 and a normalized mean bias of 2%, although the model underestimated some PM species concentrations and the model biases varied seasonally. We compared contribution estimates of major source sectors before and after applying CCFs and observed that different source sectors showed variable sensitivities to the CCF application. For example, the total primary PM2.5 contribution increased by 2.4 µg/m³, or 63%, after the CCF application; of that increment, line sources and area sources contributed 1.3 µg/m³ and 0.9 µg/m³, which is 92% of the total contribution change. We postulated two major reasons for the variations in estimated contributions after the CCF application: (1) monthly variability of the unadjusted contributions due to emission source characteristics and (2) physico-chemical differences in the environmental conditions that the emitted precursors undergo. Since the emissions-to-PM2.5 concentration conversion rate is an important piece of information for prioritizing control strategies, we examined the effects of the CCF application on the estimated conversion rates and found that applying CCFs can alter the ranking of the conversion efficiencies of source sectors. Finally, we discussed caveats of the current approach, such as not considering ion neutralization, which warrant further study.
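
The Concentration Correction Factor adjustment described above is a ratio scaling of the modeled source contributions by the observation-to-model ratio. A minimal sketch follows; the sector names and numbers are illustrative placeholders, not values from the study.

    # Minimal sketch of the Concentration Correction Factor (CCF) adjustment:
    # CCF = observed / modeled baseline concentration, applied to each contribution.

    def apply_ccf(observed, modeled_baseline, contributions):
        """Scale modeled source-sector contributions (ug/m3) by the observation/model ratio."""
        ccf = observed / modeled_baseline
        return {sector: value * ccf for sector, value in contributions.items()}

    contributions = {"line sources": 0.8, "area sources": 0.6, "point sources": 0.4}
    print(apply_ccf(observed=25.0, modeled_baseline=20.0, contributions=contributions))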

Correction of Upper Lip Depression Using Conchal Cartilage Graft in Unilateral Cleft Lip Deformity (일측구순열변형에서 이갑개연골이식술을 이용한 상구순 함몰의 교정)

  • Han, Ki-Hwan;Yun, Sang-Ho;Yeo, Hyun-Jung;Kim, Jun-Hyung;Son, Dae-Gu
    • Archives of Plastic Surgery
    • /
    • v.38 no.4
    • /
    • pp.383-390
    • /
    • 2011
  • Purpose: To correct the upper lip depression that remains after correction of a unilateral cleft lip, autologous grafts such as bone, dermal, and fascial grafts, fat injections, or alloplastic implants are used. Transplanted bone, dermis, and fascia tend to be absorbed and carry donor-site morbidity; fat injections are absorbed inconsistently; and alloplastic implants cause problems such as foreign body reactions, protrusion, and infection. The authors corrected upper lip depression using a conchal cartilage graft in unilateral cleft lip deformity and analyzed the results photogrammetrically. Methods: A total of 28 cases were treated, 26 unilateral cleft lip and 2 microform cleft lip cases. The mean age was 21.89 years; 12 patients were male and 16 were female. Under anesthesia (general: 18 cases, local: 10 cases), the cavum conchae (n=8), cymba conchae (n=16), or whole concha (n=4) was harvested. After transversely trimming the margin of the obtained cartilage, we cut out the most bent portion and made a partial-thickness incision on the concave surface in cases of excessive convexity. We then performed an onlay graft of the conchal cartilage via the scar revision site in unilateral cleft lip and via the Cupid's bow reconstruction site in microform cleft lip. The augmentation of the upper lip was evaluated on photographs. Taking the line connecting both cheilions as the horizontal reference line, we measured, relative to this line, the highest point of the junction between the upper lip and nose (point a), the lowest point (point c), the midpoint between a and c (point b), and the point on the vertical line dropped from the alare (point d). To assess postoperative symmetry, we compared the cleft-side upper lip contour index (%), A, B, C, D = [(a, b, c, d) − ch] × 100 / (ch–ch), with the non-cleft-side index (%), A', B', C', D' = [(a', b', c', d') − ch] × 100 / (ch–ch). Results: After surgery, no complication was found except in one case of cleft lip deformity with a double-layer graft, in which the lateral portion protruded. For the upper lip contour indices, the difference between A and A' was −0.83%, so a mild depression persisted. The differences between B and B', C and C', and D and D' were 0.83%, 1.07%, and 0.90%, respectively; there were statistically significant differences, and the depression of the upper lip was generally improved. Conclusion: The authors performed onlay grafting of conchal cartilage in unilateral cleft lip deformity and found, on photogrammetric analysis, that the depression of the upper lip was well corrected except at the uppermost part.

On the Improvement of Precision in Gravity Surveying and Correction, and a Dense Bouguer Anomaly in and Around the Korean Peninsula (한반도 일원의 중력측정 및 보정의 정밀화와 고밀도 부우게이상)

  • Shin, Young-Hong;Yang, Chul-Soo;Ok, Soo-Suk;Choi, Kwang-Sun
    • Journal of the Korean earth science society
    • /
    • v.24 no.3
    • /
    • pp.205-215
    • /
    • 2003
  • A precise and dense Bouguer anomaly is one of the most important data sets for improving our knowledge of the environment in geophysics and physical geodesy. Besides a precise absolute gravity station net, two parts should be considered: improving the precision of gravity measurement and its correction, and increasing the density of measurements in both number and distribution. For precise positioning, we tested how GPS could be used properly in gravity surveying and concluded that a 5-minute GPS measurement is effective when DGPS with two geodetic GPS receivers is used and the baseline is shorter than 40 km; in this case a precise geoid model such as PNU95 should be used. By applying this method, we can reduce cost, time, and the number of surveyors, and also improve quality. Two kinds of computer programs were developed to correct crossover errors and to calculate terrain effects more precisely. Repeated measurements at the same stations during gravity surveying are helpful not only for correcting spring drift but also for treating the results statistically through network adjustment, so blunders of various causes can be detected easily and the quality of the measurements can be estimated. Recent developments in computer technology, digital elevation data, and precise positioning also allow the Bouguer anomaly to be improved through more precise terrain correction. Gravity data from various sources, such as land gravity data (by Choi, NGI, etc.), marine gravity data (by NORI), the Bouguer anomaly map of North Korea, Japanese gravity data, satellite altimetry data, and the EGM96 geopotential model, were collected and processed to obtain a precise and dense Bouguer anomaly in and around the Korean Peninsula.
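
For context, a simple Bouguer anomaly combines observed gravity, normal gravity, the free-air correction, and the infinite-slab (Bouguer) correction, with terrain correction refining the slab term. The sketch below shows only the slab-plus-free-air part with standard textbook constants; it is not the processing chain or parameter set used in the study.

    # Minimal sketch of a simple Bouguer anomaly (mGal): free-air gradient plus
    # infinite-slab correction 2*pi*G*rho*h. Terrain correction is omitted.
    import math

    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    RHO = 2670.0         # standard crustal density, kg/m^3
    FREE_AIR = 0.3086    # free-air gradient, mGal/m

    def simple_bouguer_anomaly(g_obs, g_normal, height_m):
        """g_obs and g_normal in mGal, station height in meters above the reference surface."""
        free_air_corr = FREE_AIR * height_m
        slab_corr = 2 * math.pi * G * RHO * height_m * 1e5   # convert m/s^2 to mGal
        return g_obs - g_normal + free_air_corr - slab_corr

    print(simple_bouguer_anomaly(g_obs=979_850.0, g_normal=979_900.0, height_m=250.0))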

3D Accuracy Analysis of Mobile Phone-based Stereo Images (모바일폰 기반 스테레오 영상에서 산출된 3차원 정보의 정확도 분석)

  • Ahn, Heeran;Kim, Jae-In;Kim, Taejung
    • Journal of Broadcast Engineering
    • /
    • v.19 no.5
    • /
    • pp.677-686
    • /
    • 2014
  • This paper analyzes the 3D accuracy of stereo images captured with a mobile phone. For the 3D accuracy evaluation, we compared the accuracy results according to the size of the convergence angle. To calculate the 3D model-space coordinates of control points, we performed interior orientation, distortion correction, and image geometry estimation. The quantitative 3D accuracy was then evaluated by transforming the model-space coordinates into object-space coordinates. The results showed that relatively precise 3D information is generated at convergence angles greater than 17°. Consequently, a stereo model geometry with an adequate convergence angle, determined by the measurement distance and the baseline distance, is necessary for accurate 3D information generation. The results are expected to be used for stereoscopic 3D content and for 3D reconstruction from images captured with a mobile phone camera.
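
In a symmetric stereo configuration the convergence angle is fixed by the baseline length and the object distance, and depth follows from disparity in a rectified pair. The sketch below relates these quantities; the camera parameters are illustrative, not the phone calibration used in the paper.

    # Minimal sketch: convergence angle from baseline and distance, and the usual
    # depth-from-disparity relation Z = f * B / d for a rectified stereo pair.
    import math

    def convergence_angle_deg(baseline_m, distance_m):
        """Convergence angle (degrees) for two cameras toed in on a point at distance_m."""
        return math.degrees(2 * math.atan((baseline_m / 2) / distance_m))

    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        """Depth (m) from pixel disparity, given focal length in pixels and baseline in meters."""
        return focal_px * baseline_m / disparity_px

    print(convergence_angle_deg(baseline_m=0.6, distance_m=2.0))                      # about 17 degrees
    print(depth_from_disparity(focal_px=3300.0, baseline_m=0.6, disparity_px=990.0))  # 2.0 m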