• Title/Summary/Keyword: filter (normal filter)

Search Results: 353 (processing time: 0.027 seconds)

Gaussian noise addition approaches for ensemble optimal interpolation implementation in a distributed hydrological model

  • Manoj Khaniya;Yasuto Tachikawa;Kodai Yamamoto;Takahiro Sayama;Sunmin Kim
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2023.05a
    • /
    • pp.25-25
    • /
    • 2023
  • The ensemble optimal interpolation (EnOI) scheme is a sub-optimal alternative to the ensemble Kalman filter (EnKF) with a reduced computational demand, making it potentially more suitable for operational applications. Since only one model run is integrated forward instead of an ensemble of model realizations, online estimation of the background error covariance matrix is not possible in the EnOI scheme. In this study, we investigate two Gaussian-noise-based ensemble generation strategies to produce dynamic covariance matrices for assimilation of water level observations into a distributed hydrological model. In the first approach, spatially correlated noise, sampled from a normal distribution with a fixed fractional error parameter (which controls its standard deviation), is added to the model forecast state vector to prepare the ensembles. In the second method, we use an adaptive error estimation technique based on innovation diagnostics to estimate this error parameter within the assimilation framework. The results from a real experiment and a set of synthetic experiments indicate that the EnOI scheme can provide better results when an optimal EnKF is not identified, but performs worse than the ensemble filter when the true error characteristics are known. Furthermore, while the adaptive approach reduces the sensitivity to the fractional error parameter that affects the first (non-adaptive) approach, its results are usually worse at ungauged locations.

  • PDF
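The first ensemble-generation strategy in the abstract above (spatially correlated Gaussian noise scaled by a fractional error parameter, followed by an OI-style update) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the squared-exponential correlation kernel, the 1-D state vector, the observation-error variance `r_obs`, and all parameter values are assumptions made for the sketch.

```python
import numpy as np

def enoi_update(x_f, y_obs, H, frac_err, corr_len, n_ens=50, r_obs=0.01, seed=0):
    """One EnOI analysis step; the ensemble is generated by adding
    spatially correlated Gaussian noise to the single forecast x_f."""
    rng = np.random.default_rng(seed)
    n = x_f.size
    # Spatial correlation via a squared-exponential kernel (an assumption)
    d = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    C = np.exp(-(d / corr_len) ** 2)
    L = np.linalg.cholesky(C + 1e-9 * np.eye(n))
    # Noise std is a fixed fraction of the forecast state (first approach)
    sigma = frac_err * np.abs(x_f)
    pert = (L @ rng.standard_normal((n, n_ens))) * sigma[:, None]
    X = x_f[:, None] + pert                      # ensemble around the forecast
    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    B = A @ A.T / (n_ens - 1)                    # background error covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + r_obs * np.eye(H.shape[0]))
    return x_f + K @ (y_obs - H @ x_f)           # analysis state
```

The adaptive variant in the paper would additionally tune `frac_err` from innovation statistics rather than keeping it fixed.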

Development of Radiation Thermometer using InSb Photo-detector (인듐안티모나이드(InSb) 소자를 이용한 적외선 방사온도 계측시스템의 개발연구)

  • Hwang, Byeong-Oc;Lee, Won-Sik;Jhang, Kyung-Young
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.12 no.7
    • /
    • pp.46-52
    • /
    • 1995
  • This paper proposes methodologies for the development of a radiation thermometer using an InSb photo-detector, whose spectral sensitivity is excellent over the wavelength range of 2 μm to 5 μm. The proposed radiation thermometer has a broad measurement range, from normal temperatures up to more than 1000 °C, with high accuracy, and can measure the surface temperature or heat emission of a material without contact and at high speed. The optical system consists of two convex lenses with a focal length of 15.2 mm for focusing the infrared rays, a Ge filter to cut the short-wavelength components, and a sapphire filter to cut the long-wavelength components. A cold shield was installed over the whole surface of the light-absorbing element to remove error. For calibration of the thermometer, calibration using a blackbody furnace with a temperature range of 90 °C to 1100 °C was carried out, and the temperature calibration curve was obtained by exponential-function curve fitting. The results show a maximum error of less than 0.24% (640 K ± 1.6 K) over the measurement range of 90 °C to 700 °C, and from this result the usefulness of the developed thermometer has been confirmed.

  • PDF
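The calibration step described above, fitting an exponential function to blackbody-furnace readings and inverting it to read temperature, can be illustrated with a small sketch. The Wien-type response model `V = a·exp(-b/T)` and all values here are assumptions for illustration, not the paper's actual calibration data or model.

```python
import numpy as np

def fit_calibration(T_kelvin, V_out):
    """Fit V = a * exp(-b / T) (an assumed Wien-type detector response)
    by linearizing: ln V = ln a - b * (1/T), then a degree-1 polyfit."""
    slope, intercept = np.polyfit(1.0 / T_kelvin, np.log(V_out), 1)
    return np.exp(intercept), -slope             # (a, b)

def temperature_from_output(V, a, b):
    """Invert the fitted calibration curve: T = b / ln(a / V)."""
    return b / np.log(a / V)
```

In practice the furnace readings would be noisy, so the fitted curve only approximates the true response; the paper reports the residual error of its own fit as below 0.24%.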

Advances in the use of dried blood spots on filter paper to monitor kidney disease

  • Carla Nicola;Vandrea de Souza
    • Childhood Kidney Diseases
    • /
    • v.28 no.1
    • /
    • pp.16-26
    • /
    • 2024
  • Patients with kidney disease require frequent blood tests to monitor their kidney function, which is particularly difficult for young children and the elderly. For these people, the standard method is to evaluate serum creatinine or cystatin C or drug levels through venous sampling, but more recently, evaluation using dried blood spots has been used. This narrative review reports information from the literature on the use of dried blood spots to quantify the main markers used to detect kidney diseases. The ScienceDirect and PubMed databases were searched using the keywords: "dried blood on filter paper," "markers of renal function," "renal function," "creatinine," "cystatin C," "urea," "iohexol," and "iotalamate." Studies using animal samples were excluded, and only relevant articles in English or Spanish were considered. Creatinine was the most assessed biomarker in studies using dried blood spots to monitor kidney function, showing good performance in samples whose hematocrit levels were within normal reference values. According to the included studies, dried blood spots are a practical monitoring alternative for kidney disease. Validation parameters, such as sample and card type, volume, storage, internal patterns, and the effects of hematocrit are crucial to improving the reliability of these results.

A Study on Possibility of Detection of Insulators' Faults by Analyses of Radiation Noises from Insulators (애자의 소음 분석을 통한 애자 고장 탐지 가능성 연구)

  • Park, Kyu-Chil;Yoon, Jong-Rak;Lee, Jae-Hun
    • The Journal of the Acoustical Society of Korea
    • /
    • v.28 no.8
    • /
    • pp.822-831
    • /
    • 2009
  • Porcelain insulators are important devices used for electrical isolation and mechanical support in high-voltage power transmission systems. Faults in the insulators cause very serious problems for the power transmission line. In this paper, we introduce techniques for detecting insulator faults from their acoustic radiation noise. We measured the radiation noise from insulators in the normal state and in fault states. The insulators used were two different types of porcelain insulator, a cut-out switch, two different types of line post, and a lightning arrester. The results were compared with each other in the time domain, in the frequency domain, and through filter-bank outputs. We found that detection of insulator faults is possible and also suggested techniques for fault detection.
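A minimal sketch of the kind of filter-bank comparison mentioned above, assuming an FFT-based band-energy computation; the paper does not specify its filter-bank implementation, and the band edges below are illustrative.

```python
import numpy as np

def band_energies(signal, fs, edges):
    """Energy of `signal` in each frequency band [edges[i], edges[i+1]),
    computed from the FFT power spectrum (a simple filter-bank stand-in)."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return np.array([spec[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])
```

Comparing such band-energy vectors between normal and faulty recordings is one simple way to expose the spectral differences the paper reports.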

LiDAR Static Obstacle Map based Position Correction Algorithm for Urban Autonomous Driving (도심 자율주행을 위한 라이다 정지 장애물 지도 기반 위치 보정 알고리즘)

  • Noh, Hanseok;Lee, Hyunsung;Yi, Kyongsu
    • Journal of Auto-vehicle Safety Association
    • /
    • v.14 no.2
    • /
    • pp.39-44
    • /
    • 2022
  • This paper presents a LiDAR static obstacle map based vehicle position correction algorithm for urban autonomous driving. Real-Time Kinematic (RTK) GPS is commonly used in highway automated vehicle systems, but in urban automated vehicle systems RTK GPS has trouble in shaded areas. Therefore, this paper presents a method to estimate the position of the host vehicle using an AVM camera, a front camera, LiDAR, and low-cost GPS based on an Extended Kalman Filter (EKF). The static obstacle map (STOM) is constructed only from static objects based on Bayes' rule. To run the algorithm, an HD map and a static obstacle reference map (STORM) must be prepared in advance; the STORM is constructed by accumulating and voxelizing the STOM. The algorithm consists of three main processes. First, sensor data are acquired from the low-cost GPS, AVM camera, front camera, and LiDAR. Second, the low-cost GPS data are used to define the initial point. Third, the AVM camera, front camera, and LiDAR point clouds are matched to the HD map and the STORM using the Normal Distributions Transform (NDT) method, and the position of the host vehicle is corrected based on the EKF. The proposed algorithm was implemented in the Linux Robot Operating System (ROS) environment and showed better performance than a lane-detection-only algorithm. It is expected to be more robust and accurate than a raw LiDAR point cloud matching algorithm in autonomous driving.
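The EKF correction in the final step can be sketched generically. The sketch below assumes a 3-state pose (x, y, yaw) observed directly by an NDT-matching result (so H = I); the paper's actual state and measurement models are not given in the abstract.

```python
import numpy as np

def ekf_pose_update(x, P, z, R):
    """EKF measurement update with a direct pose observation (H = I),
    e.g. a pose returned by NDT map matching (a sketch, not the paper's model).
    x: state mean (3,), P: state covariance (3,3),
    z: measured pose (3,), R: measurement noise covariance (3,3)."""
    H = np.eye(3)
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ (z - H @ x)            # corrected pose
    P_new = (np.eye(3) - K @ H) @ P        # updated covariance
    return x_new, P_new
```

With equal state and measurement uncertainty, the update lands halfway between prediction and measurement, which is the expected Kalman behavior.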

Mobile Robot Localization in Geometrically Similar Environment Combining Wi-Fi with Laser SLAM

  • Gengyu Ge;Junke Li;Zhong Qin
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.17 no.5
    • /
    • pp.1339-1355
    • /
    • 2023
  • Localization is an active research topic in many areas, especially in the mobile robot field. Because the global positioning system (GPS) signal is weak indoors, alternative schemes for indoor environments include wireless signal transmitting and receiving solutions, laser rangefinders that build a map followed by a re-localization stage, visual positioning methods, etc. Among all wireless signal positioning techniques, Wi-Fi is the most common: Wi-Fi access points are installed in most indoor areas of human activity, and smart devices equipped with Wi-Fi modules can be seen everywhere. However, localization of a mobile robot using a Wi-Fi scheme usually lacks orientation information, and the distance error is large because of indoor signal interference. Another research direction, based mainly on laser sensors, is to actively sense the environment and achieve positioning. An occupancy grid map is built using the simultaneous localization and mapping (SLAM) method when the mobile robot enters the indoor environment for the first time; when the robot enters the environment again, it can localize itself according to the known map. Nevertheless, this scheme only works effectively on the prerequisite that the areas have salient geometrical features. If the areas have similar scanned structures, such as a long corridor or similar rooms, the traditional methods always fail. To address the weaknesses of the above two methods, this work proposes a coarse-to-fine paradigm and an improved localization algorithm that uses Wi-Fi to assist robot localization in a geometrically similar environment. First, a grid map is built using laser SLAM. Second, a fingerprint database is built in the offline phase. Then, RSSI values are obtained in the localization stage to get a coarse localization. Finally, an improved particle filter method based on the Wi-Fi signal values is proposed to realize fine localization. Experimental results show that our approach is effective and robust for both global localization and the kidnapped robot problem: the localization success rate reaches 97.33%, while the traditional method always fails.
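The fine-localization step, re-weighting particles by Wi-Fi fingerprint similarity, might look roughly like this. The grid-cell fingerprint database, the Gaussian RSSI likelihood, and `sigma` are assumptions made for the sketch, not the paper's exact formulation.

```python
import numpy as np

def wifi_weight_particles(particles, fingerprint_db, observed_rssi, sigma=4.0):
    """Re-weight particles by how well the Wi-Fi fingerprint stored for each
    particle's grid cell matches the observed RSSI vector (Gaussian likelihood).
    `fingerprint_db` maps a grid cell (ix, iy) -> expected RSSI vector (assumed)."""
    w = np.empty(len(particles))
    for i, (px, py) in enumerate(particles):
        cell = (int(round(px)), int(round(py)))
        expected = fingerprint_db.get(cell)
        if expected is None:
            w[i] = 1e-12                          # cell never fingerprinted
        else:
            d2 = np.sum((observed_rssi - expected) ** 2)
            w[i] = np.exp(-d2 / (2 * sigma ** 2))
    return w / w.sum()                            # normalized weights
```

In a geometrically similar environment (two look-alike rooms), laser scans weight both hypotheses equally, while the Wi-Fi term above breaks the tie, which is the core idea of the coarse-to-fine scheme.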

Application of CSP Filter to Differentiate EEG Output with Variation of Muscle Activity in the Left and Right Arms (좌우 양팔의 근육 활성도 변화에 따른 EEG 출력 구분을 위한 CSP 필터의 적용)

  • Kang, Byung-Jun;Jeon, Bu-Il;Cho, Hyun-Chan
    • Journal of IKEEE
    • /
    • v.24 no.2
    • /
    • pp.654-660
    • /
    • 2020
  • Using the brain waves output during muscle operation, this paper examines whether it is possible to find characteristic vectors of EEG signals, in which considerable uncertainty exists, that can distinguish left and right movements by extracting brain waves in the specific regions of muscle signal output that include the motion of the left and right muscles or the intention of the user. With typical surface EMG and noninvasive brain wave extraction, no method exists to determine, from the degree of ionization by internal neurotransmitters and the magnitude of electrical conductivity, whether a signal corresponds to a motion. In joint and motor control through normal robot control systems or electrical signals, the signals can be identified through the transmission and feedback control of specific signals; for the human body, however, evidence for the exact protocols between the brain and the muscles is lacking. Therefore, in this paper, efficiency is verified by applying a Common Spatial Pattern (CSP) filter to confirm that left-hand and right-hand signals can be extracted through brainwave analysis while the subject performs the behavior. In addition, we propose ways to obtain data through an experimental design for verification, to verify the change in results with and without filter application, and to increase the accuracy of the classification.
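For reference, the textbook CSP computation the paper applies can be sketched as follows: a generic whitening-plus-eigendecomposition implementation, not the authors' code, with trial shapes and the number of filters chosen for illustration.

```python
import numpy as np

def csp_filters(trials_left, trials_right, n_filters=2):
    """Common Spatial Pattern filters from two classes of EEG trials.
    Each trial has shape (channels, samples). Solves the CSP eigenproblem
    by whitening the composite covariance (textbook CSP, a sketch)."""
    def avg_cov(trials):
        return np.mean([np.cov(t) for t in trials], axis=0)
    C1, C2 = avg_cov(trials_left), avg_cov(trials_right)
    # Whitening transform for the composite covariance C1 + C2
    evals, evecs = np.linalg.eigh(C1 + C2)
    W = evecs @ np.diag(evals ** -0.5) @ evecs.T
    # Eigenvectors of the whitened class-1 covariance, sorted by eigenvalue
    s, U = np.linalg.eigh(W @ C1 @ W.T)
    order = np.argsort(s)
    # Filters from both ends of the spectrum are the most discriminative
    picks = np.r_[order[:n_filters // 2], order[-(n_filters - n_filters // 2):]]
    return U[:, picks].T @ W                      # rows are spatial filters
```

Projecting a trial through these filters yields signals whose variance is maximal for one class and minimal for the other, which is what makes the left/right distinction separable.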

Comparison of Digital Filters with Wavelet Multiresolution Filter for Electrogastrogram (위전도 신호처리를 위한 웨이브렌 필터와 디지털 필터의 비교)

  • 유창용;남기창;김수찬;김덕원
    • Journal of Biomedical Engineering Research
    • /
    • v.23 no.2
    • /
    • pp.109-117
    • /
    • 2002
  • Electrogastrography (EGG) is a noninvasive method for measuring, on the abdomen, the gastric electrical activity produced by the gastric muscle. EGG signals have a very low frequency range (0.0083-0.15 Hz) and extremely low amplitude (10-100 μV); consequently, the EGG signal is easily contaminated by other noises. Both finite impulse response (FIR) and infinite impulse response (IIR) filters need high orders or suffer phase distortion to pass the very narrow bandwidth of the EGG signal. In this study, we decomposed EGG signals using a wavelet multiresolution method with a Daubechies mother wavelet. The EGG signals were decomposed to seven levels, and the signal was reconstructed by summing the decomposed signals from levels four to seven. To evaluate the performance of the wavelet multiresolution filter (WMF) on a simulated EGG signal against two kinds of FIR and four kinds of IIR filters, we used two indices: signal-to-noise ratio (SNR) and reconstruction squared error (RSE). The SNR of the WMF was 9.5, 6.9, and 4.7 dB higher than that of the other filters at the different noise levels, respectively. Also, the RSE of the WMF was 1.22×10⁶, 1.16×10⁶, and 1.02×10⁶ smaller than that of the other filters at the different noise levels, respectively. The WMF performed better in SNR and RSE than the two kinds of FIR and the four kinds of IIR filters.
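The decompose, select levels, reconstruct scheme described above can be sketched in a self-contained way. To stay dependency-free, the sketch uses the Haar wavelet instead of the paper's Daubechies wavelet and requires a power-of-two signal length; it illustrates the multiresolution filtering idea, not the paper's implementation.

```python
import numpy as np

def haar_mra_filter(x, keep_levels):
    """Multiresolution filtering sketch: decompose with the Haar wavelet,
    zero the detail levels not in `keep_levels`, then reconstruct.
    Level 1 is the finest scale; len(x) must be a power of two."""
    details, approx = [], np.asarray(x, dtype=float)
    level = 0
    while len(approx) >= 2:
        level += 1
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)   # approximation
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)   # detail
        details.append((level, d))
        approx = a
    # Reconstruct, keeping only the requested detail levels
    for level, d in reversed(details):
        if level not in keep_levels:
            d = np.zeros_like(d)
        up = np.empty(2 * len(d))
        up[0::2] = (approx + d) / np.sqrt(2)
        up[1::2] = (approx - d) / np.sqrt(2)
        approx = up
    return approx
```

Summing only the coarse levels (as the paper does with levels four to seven) acts as a narrow low-frequency filter without the high filter orders or phase distortion of FIR/IIR designs.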

A Study on the Performance of Companding Algorithms for Digital Hearing Aid Users (디지털 보청기 사용자를 위한 압신 알고리즘의 성능 연구)

  • Hwang, Y.S.;Han, J.H.;Ji, Y.S.;Hong, S.H.;Lee, S.M.;Kim, D.W.;Kim, In-Young;Kim, Sun-I.
    • Journal of Biomedical Engineering Research
    • /
    • v.32 no.3
    • /
    • pp.218-229
    • /
    • 2011
  • Companding algorithms have been used to enhance speech recognition in noise for cochlear implant users, but the efficiency of companding for digital hearing aid users has not yet been validated. The purpose of this study is to evaluate the performance of companding for digital hearing aid users in various hearing loss cases. Using HeLPS, a hearing loss simulator, two different sensorineural hearing loss conditions were simulated: mild gently sloping hearing loss (HL1) and moderate to steeply sloping hearing loss (HL2). In addition, non-linear compression was simulated to compensate for hearing loss using National Acoustic Laboratories non-linear version 1 (NAL-NL1) in HeLPS. For companding, four different strategies were used, changing the Q values (q1, q2) of the pre-filter (F filter) and the post-filter (G filter). First, five IEEE sentences presented with speech-shaped noise at different SNRs (0, 5, 10, 15 dB) were processed by the companding. Second, the processed signals were applied to HeLPS; for comparison, signals not processed by companding were also applied to HeLPS. For the processed signals, the log-likelihood ratio (LLR) and cepstral distance (CEP) were measured to evaluate speech quality, and fourteen normal-hearing listeners performed a speech reception threshold (SRT) test to evaluate speech intelligibility. As a result of this study, the signals processed with companding and NAL-NL1 performed better than those processed with only NAL-NL1 in the sensorineural hearing loss conditions. Moreover, the higher ratio of Q values showed better scores in LLR and CEP. In the SRT test, the signals processed with companding (SRT = -13.33 dB SPL) showed significantly better speech perception in noise than those processed using only NAL-NL1 (SRT = -11.56 dB SPL).

Main Content Extraction from Web Pages Based on Node Characteristics

  • Liu, Qingtang;Shao, Mingbo;Wu, Linjing;Zhao, Gang;Fan, Guilin;Li, Jun
    • Journal of Computing Science and Engineering
    • /
    • v.11 no.2
    • /
    • pp.39-48
    • /
    • 2017
  • Main content extraction from web pages is widely used in search engines, web content aggregation, and mobile Internet browsing. However, a mass of irrelevant information such as advertisements, irrelevant navigation, and trash information is included in web pages, and such irrelevant information reduces the efficiency of web content processing in content-based applications. The purpose of this paper is to propose an automatic main content extraction method for web pages. In this method, we use two indicators to describe characteristics of web pages: text density and hyperlink density. Based on the continuous distribution of similar content on a page, we use an estimation algorithm to judge whether a node is a content node or a noisy node from the characteristics of the node and its neighboring nodes. This algorithm enables us to filter out advertisement nodes and irrelevant navigation. Experimental results on 10 news websites revealed that our algorithm achieves a 96.34% average acceptable rate.
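The two node indicators described above can be sketched as simple functions. The density definitions are common in the content-extraction literature, and the thresholds below are illustrative assumptions, not the values used in the paper.

```python
def node_densities(text_chars, link_text_chars, tag_count):
    """The paper's two indicators, in a common formulation:
    text density = characters of text per tag under the node,
    hyperlink density = fraction of the node's text inside <a> tags."""
    text_density = text_chars / max(tag_count, 1)
    link_density = link_text_chars / max(text_chars, 1)
    return text_density, link_density

def is_content_node(text_chars, link_text_chars, tag_count,
                    min_text_density=8.0, max_link_density=0.5):
    """Heuristic classification sketch (thresholds are illustrative):
    content nodes are text-rich per tag and mostly non-link text."""
    td, ld = node_densities(text_chars, link_text_chars, tag_count)
    return td >= min_text_density and ld <= max_link_density
```

An article paragraph (much text, few tags, few links) passes both checks, while a navigation bar (little text, many tags, nearly all of it linked) fails the hyperlink-density check, which is how advertisement and navigation nodes get filtered.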