• Title/Summary/Keyword: Mapping error

Search Results: 449

Kinestatic Control using a Compliant Device by Fuzzy Logic (퍼지 논리에 의한 순응기구의 위치/힘 동시제어)

  • Seo, Jeong-Wook;Choi, Yong-Je
    • Proceedings of the KSME Conference / 2004.04a / pp.917-922 / 2004
  • As robot tasks become more diverse, some complicated tasks have come to require hybrid force/position control. A compliant device can be used to control force and position simultaneously by decomposing the twist of the robot's end-effector into a twist of compliance and a twist of freedom using the stiffness mapping of the compliant device. This paper describes the development of a fuzzy gain-scheduling control scheme for a robot equipped with a compliant device. Fuzzy rules and reasoning are performed on-line to determine the gains of the twists based on the wrench error, the twist error, and the ratio of the twist of compliance to the twist of freedom. Simulation results demonstrate that better control performance can be achieved in comparison with constant-gain control.

  • PDF
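The gain-scheduling idea in the abstract above can be sketched with a toy fuzzy controller. Everything here (the membership shapes, the rule consequents, and the way wrench and twist errors are aggregated) is a hypothetical illustration, not the paper's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def schedule_gain(wrench_error, twist_error):
    """Blend a control gain from fuzzy rules on a normalized error in [0, 1]."""
    e = min(1.0, abs(wrench_error) + abs(twist_error))  # crude aggregate error
    # Membership degrees for SMALL / MEDIUM / LARGE error (illustrative shapes).
    mu = {
        "small": tri(e, -0.5, 0.0, 0.5),
        "medium": tri(e, 0.0, 0.5, 1.0),
        "large": tri(e, 0.5, 1.0, 1.5),
    }
    # Rule consequents: small error -> low gain, large error -> high gain.
    gain_for = {"small": 0.2, "medium": 0.6, "large": 1.0}
    num = sum(mu[k] * gain_for[k] for k in mu)
    den = sum(mu.values())
    return num / den if den > 0 else gain_for["medium"]

print(schedule_gain(0.1, 0.1))  # blended gain between 0.2 and 1.0
```

The weighted-average defuzzification makes the gain vary smoothly with the error, which is the advantage the abstract claims over constant-gain control.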

A Critical Design Method of the Space-Based SARP Using RDA (RDA사용 위성기반 SARP 주요설계기법)

  • Hong, In-Pyo
    • The Journal of Korean Institute of Communications and Information Sciences / v.31 no.1C / pp.46-54 / 2006
  • The design method for a synthetic aperture radar processor (SARP) in the critical design stage is to describe the processing algorithm, estimate the fractional errors, and set out the software (SW) and hardware (HW) mapping. Previous design methods for SARPs are complex and HW-dependent. This paper therefore proposes a critical design method that is more general and independent of the HW. The methodology can be applied to developing a space-based SARP using the range-Doppler algorithm (RDA).

The use of digital imaging and laser scanning technologies in rock engineering

  • Kemeny John;Monte Jamie;Handy Jeff;Thiam Samba
    • Korean Society of Earth and Exploration Geophysicists: Conference Proceedings / 2003.11a / pp.35-41 / 2003
  • Rock mass characterization is an integral part of rock engineering design. Much of the information for rock mass characterization comes from field fracture mapping and data collection. This paper describes two technologies that can assist with the field mapping and data collection activities associated with rock mass characterization: digital image processing and 3D laser scanning. The basis for these techniques is described, along with the results of field case studies and an analysis of the error in estimating fracture orientation.

  • PDF

Cross-Enrichment of the Heterogenous Ontologies Through Mapping Their Conceptual Structures: the Case of Sejong Semantic Classes and KorLexNoun 1.5 (이종 개념체계의 상호보완방안 연구 - 세종의미부류와 KorLexNoun 1.5 의 사상을 중심으로)

  • Bae, Sun-Mee;Yoon, Ae-Sun
    • Language and Information / v.14 no.1 / pp.165-196 / 2010
  • The primary goal of this paper is to propose methods of enriching two heterogeneous ontologies: the Sejong Semantic Classes (SJSC) and KorLexNoun 1.5 (KLN). To achieve this goal, the study introduces the pros and cons of the two ontologies and analyzes the error patterns found during fine-grained manual mapping between them. The error patterns can be classified into four types: (1) structural defects in node branching, (2) errors in assigning semantic classes, (3) deficiencies in the linguistic information provided, and (4) a lack of lexical units representing specific concepts. According to these error patterns, we propose different solutions to correct the node-branching defects and the semantic class assignments, to complement the deficient linguistic information, and to increase the number of lexical units suitably allotted to their corresponding concepts. Using the results of this study, we can obtain more enriched ontologies by correcting the defects and errors in each ontology, which will enhance their practicality for syntactic and semantic analysis.

  • PDF

Extraction of Highway's Superelevation Using GPS Real Time Kinematic Surveying (GPS 실시간 동적측위법을 이용한 도로 편경사 추출)

  • 서동주;장호식;이종출
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.20 no.2 / pp.183-190 / 2002
  • This study concerns the extraction of highway superelevation using real-time kinematic (RTK) surveying, one of the GPS surveying methods and an economical way to build a database for highway maintenance and management. Using an instrumented vehicle, the centerline and shoulder of the highway were measured, and analysis of the results showed sufficient precision: errors of 1.3 cm to 2.0 cm on the clothoid sections and about 0.8 cm to 1.2 cm on the circular curves, errors attributable to the lane marking laid down during construction. This approach is expected to become an efficient method for extracting highway alignment elements in a Mobile Mapping System.

A copula based bias correction method of climate data

  • Gyamfi Kwame Adutwum;Eun-Sung Chung
    • Proceedings of the Korea Water Resources Association Conference / 2023.05a / pp.160-160 / 2023
  • Generally, Global Climate Models (GCMs) cannot be used directly due to their inherent error arising from over- or under-estimation of climate variables compared to observed data. Several bias correction methods have been devised to solve this problem. Most traditional bias correction methods are one-dimensional in that they bias-correct each climate variable separately. One such method is Quantile Mapping, which builds a transfer function based on the statistical differences between the GCM and observed variables. Laux et al. introduced a copula-based method that bias-corrects simulated climate data by employing not one but two climate variables simultaneously, essentially extending the traditional one-dimensional method into two dimensions, but it has some limitations. This study uses objective functions to specifically address the limitations of Laux's method with respect to Quantile Mapping. The objective functions used were the observed rank correlation function, the observed moment function, and the observed likelihood function. To illustrate the performance of the method, it is applied to ten GCMs for 20 stations in South Korea. The marginal distributions used were the Weibull, Gamma, Lognormal, Logistic, and Gumbel distributions, and the tested copula families include most Archimedean families. Six performance metrics are used to evaluate the efficiency of the method: Mean Square Error, Root Mean Square Error, the Kolmogorov-Smirnov test, Percent Bias, Nash-Sutcliffe Efficiency, and the Kullback-Leibler Divergence. The results showed a significant improvement over Laux's method, especially when maximizing the observed rank correlation function, and when maximizing a combination of the observed rank correlation and observed moment functions, for all GCMs in the validation period.

  • PDF
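The one-dimensional transfer-function idea behind Quantile Mapping, which the copula method above extends to two variables, can be sketched in a few lines. The gamma-distributed "observations" and the bias applied to the "GCM" series are synthetic illustration data, not the study's station records:

```python
import numpy as np

def quantile_map(model, obs, model_hist=None):
    """Bias-correct `model` so its distribution matches `obs`.

    Each model value is mapped to the observed value at the same empirical
    quantile, with linear interpolation between tabulated quantiles.
    """
    model_hist = model if model_hist is None else model_hist
    q = np.linspace(0.0, 1.0, 101)
    m_q = np.quantile(model_hist, q)   # model quantiles (transfer input)
    o_q = np.quantile(obs, q)          # observed quantiles (transfer output)
    return np.interp(model, m_q, o_q)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 3.0, size=1000)                  # synthetic "observed" series
model = rng.gamma(2.0, 3.0, size=1000) * 1.4 + 1.0    # synthetic biased GCM output
corrected = quantile_map(model, obs)
```

After correction the model series shares the observed marginal distribution, which is exactly the property the copula extension preserves while additionally matching the dependence between two variables.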

Evaluation of Various Tone Mapping Operators for Backward Compatible JPEG Image Coding

  • Choi, Seungcheol;Kwon, Oh-Jin;Jang, Dukhyun;Choi, Seokrim
    • KSII Transactions on Internet and Information Systems (TIIS) / v.9 no.9 / pp.3672-3684 / 2015
  • Recently, the standardization of backward compatible JPEG image coding for high dynamic range (HDR) images has been undertaken to establish an international standard called "JPEG XT." JPEG XT consists of two layers: the base layer, which contains tone-mapped low dynamic range (LDR) image data, and the residual layer, which contains the error signal used to reconstruct the HDR image. This paper reports a study evaluating the overall performance of tone mapping operators (TMOs) for this standard. The evaluation is performed using five HDR image datasets and six TMOs for profiles A, B, and C of the proposed JPEG XT standard. The Tone Mapped image Quality Index (TMQI) and no-reference image quality assessment (NR IQA) are used to measure LDR image quality, and the peak signal-to-noise ratio (PSNR) is used to evaluate the overall compression performance of profiles A, B, and C. In the TMQI and NR IQA measurements, the TMOs using display adaptive tone mapping and adaptive logarithmic mapping each gave good results, and the TMO using adaptive logarithmic mapping also gave good PSNRs.
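Adaptive logarithmic mapping, one of the TMOs the evaluation above highlights, follows the formulation of Drago et al. The sketch below uses illustrative parameter values (`bias=0.85`, `ld_max=100`) and synthetic luminances; it is a hedged approximation from the published formula, not the exact operator implementation used in the study:

```python
import numpy as np

def drago_tmo(lum, bias=0.85, ld_max=100.0):
    """Map HDR luminance to a displayable range in [0, 1].

    The log base adapts with luminance: dark pixels are compressed with a
    base near 2, the brightest pixel with base 10, steered by `bias`.
    """
    lw_max = lum.max()
    exponent = np.log(bias) / np.log(0.5)          # bias-controlled adaptation
    scale = (ld_max * 0.01) / np.log10(lw_max + 1.0)
    ld = scale * np.log(lum + 1.0) / np.log(2.0 + 8.0 * (lum / lw_max) ** exponent)
    return np.clip(ld, 0.0, 1.0)

hdr = np.geomspace(0.01, 1000.0, 8)   # synthetic HDR luminances over 5 decades
ldr = drago_tmo(hdr)
```

The mapping is monotone, so relative brightness ordering survives into the LDR base layer, which matters for the residual layer's error signal staying small.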

An Application of ISODATA Method for Regional Lithological Mapping (광역지질도 작성을 위한 ISODATA 응용)

  • 朴鍾南;徐延熙
    • Korean Journal of Remote Sensing / v.5 no.2 / pp.109-122 / 1989
  • The ISODATA method, one of the best-known squared-error clustering methods, has been applied to two Chungju multivariate data sets in order to evaluate its effectiveness for regional lithological mapping. One is an airborne radiometric data set and the other is a mixed set of airborne radiometric and Landsat TM data. In both cases, the classification of the Bulguksa granite and the Kyemyongsan biotite-quartz gneiss is the most successful. The Hyangsanni dolomitic limestone and the neighboring Daehyangsan quartzite are also classified by their typically low radioactive intensities, though they are still confused with some other units, such as water-covered areas, nearby alluvium, and unaltered limestone areas. Topographically rugged valleys are also assigned to the same cluster, which could be due to unavoidable variations in the flight height and attitude of the airborne system over such rugged terrain. The regional mapping of the sedimentary rock units of the Ockchun System is generally confused, which might be due to similarities between the different sediments. Considerable discrepancies in mapping some lithological boundaries might also be due to secondary effects such as contamination or smoothing in the digitizing process. Further study should continue on the variable selection scheme, since no absolutely superior method yet exists and performance appears to be rather data dependent. Study could also be made of data preprocessing in order to reduce the erratic effects mentioned above and thus, hopefully, produce much better results in regional geological mapping.
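The squared-error clustering at the heart of ISODATA can be sketched as k-means assignment/update iterations plus a centroid-merge step; full ISODATA also splits high-variance clusters and discards tiny ones, which this illustrative sketch (with synthetic two-band "pixel" data) omits:

```python
import numpy as np

def isodata(X, k=4, merge_dist=0.5, iters=20, seed=0):
    """Toy ISODATA-style clustering: k-means plus merging of close centroids."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each sample to its nearest centroid (squared-error criterion).
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update centroids as cluster means, dropping empty clusters.
        centers = np.array([X[labels == j].mean(axis=0)
                            for j in range(len(centers)) if np.any(labels == j)])
        # Merge groups of centroids closer together than merge_dist.
        merged, used = [], set()
        for i in range(len(centers)):
            if i in used:
                continue
            group = [centers[i]]
            for j in range(i + 1, len(centers)):
                if j not in used and np.linalg.norm(centers[i] - centers[j]) < merge_dist:
                    group.append(centers[j])
                    used.add(j)
            merged.append(np.mean(group, axis=0))
        centers = np.array(merged)
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return centers, d.argmin(axis=1)

# Two well-separated blobs: starting from k=4, merging should prune
# redundant centroids rather than force four classes.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 2)), rng.normal(5.0, 0.1, (50, 2))])
centers, labels = isodata(X, k=4)
```

The merge step is what lets the cluster count adapt to the data, which is the practical advantage of ISODATA over plain k-means when the number of lithological units is not known in advance.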

Low-complexity de-mapping algorithms for 64-APSK signals

  • Bao, Junwei;Xu, Dazhuan;Zhang, Xiaofei;Luo, Hao
    • ETRI Journal / v.41 no.3 / pp.308-315 / 2019
  • Due to its high spectrum efficiency, 64-amplitude phase-shift keying (64-APSK) is one of the primary modulation technologies used in deep-space communications and in digital video broadcasting via satellite, second generation (DVB-S2). However, 64-APSK suffers from considerable computational complexity because of the de-mapping method it employs. In this study, a low-complexity de-mapping method for (4 + 12 + 20 + 28) 64-APSK is proposed that takes full advantage of the symmetry of the symbol mapping. The detected symbol is mapped to the first quadrant, and the first quadrant is then divided into several partitions to simplify the de-mapping formula. Theoretical analysis shows that the proposed method requires no exponent or logarithm operations and involves only multiplication, addition, subtraction, and comparison. Simulation results validate that the time consumption is dramatically decreased with only limited degradation of bit-error-rate performance.
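The quadrant-folding and region-partition ideas can be illustrated as follows. The ring radii and thresholds are hypothetical placeholders, not the actual (4 + 12 + 20 + 28) 64-APSK constellation parameters, and the partition shown is only the amplitude-ring decision:

```python
import bisect

RING_RADII = [1.0, 1.8, 2.6, 3.4]   # hypothetical radii for the 4/12/20/28 rings
THRESHOLDS = [(a + b) / 2 for a, b in zip(RING_RADII, RING_RADII[1:])]

def fold_to_first_quadrant(z):
    """Reflect z across the axes into the first quadrant.

    A quadrant-symmetric constellation lets the de-mapper evaluate distance
    metrics in one quadrant only; the recorded reflections tell which
    quadrant the symbol came from, recovering the quadrant (sign) bits.
    """
    flips = (z.real < 0, z.imag < 0)   # which axes were reflected
    return complex(abs(z.real), abs(z.imag)), flips

def nearest_ring(z):
    """Pick the amplitude ring by comparing |z| to midpoint thresholds."""
    return bisect.bisect(THRESHOLDS, abs(z))

folded, flips = fold_to_first_quadrant(complex(-0.9, 2.4))
ring = nearest_ring(folded)   # |0.9+2.4j| ≈ 2.56 falls in ring index 2
```

Both operations use only comparisons and absolute values, consistent with the paper's claim of avoiding exponent and logarithm operations.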

Mapping algorithm for Error Compensation of Indoor Localization System (실내 측위 시스템의 오차 보정을 위한 매핑 알고리즘)

  • Kim, Tae-Kyum;Cho, We-Duke
    • Journal of the Institute of Electronics Engineers of Korea CI / v.47 no.4 / pp.109-117 / 2010
  • With the advent of new technologies such as HSDPA and WiBro (Wireless Broadband) and of personal devices, we can access various contents and services anytime and anywhere. A location-based service (LBS) is essential for providing personalized services with individual location information in a ubiquitous computing environment. In this paper, we propose a mapping algorithm for error compensation in an indoor localization system, and we also describe the filter and the indoor localization system itself. We have developed mapping algorithms composed of a map recognition method and a position compensation method; the map recognition method achieves physical-space recognition and map-element relation extraction. We improved the accuracy of position searching and, in addition, reduced position errors using a dynamic scale factor.
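The map-based compensation idea above — use knowledge of the physical space to reject impossible position fixes — can be illustrated with a toy occupancy grid. The paper's map recognition and dynamic scale factor are far richer than this sketch, and the grid layout here is invented for illustration:

```python
# 1 = walkable corridor cell, 0 = wall (toy 4x5 map; cell size 1 m).
GRID = [
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
]

def compensate(x, y):
    """Snap a raw position estimate (x, y) to the nearest walkable cell center."""
    walkable = [(c + 0.5, r + 0.5)
                for r, row in enumerate(GRID)
                for c, v in enumerate(row) if v == 1]
    return min(walkable, key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)

print(compensate(0.4, 1.6))  # raw fix inside a wall snaps to a corridor: (0.5, 2.5)
```

Snapping to the recognized map constrains the localization error to positions a person could actually occupy, which is the essence of the position compensation method.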