• Title/Summary/Keyword: Available time and unavailable time


Pulse Broadening and Intersymbol Interference of the Optical Gaussian Pulse Due to Atmospheric Turbulence in an Optical Wireless Communication System (광 무선통신시스템에서 대기 교란으로 인한 광 가우시안 펄스의 펄스 퍼짐과 부호 간 간섭에 관한 연구)

  • Jung, Jin-Ho
    • Korean Journal of Optics and Photonics
    • /
    • v.16 no.5
    • /
    • pp.417-422
    • /
    • 2005
  • When an optical pulse propagates through the atmospheric channel, it is attenuated and spread by atmospheric turbulence. This pulse broadening produces intersymbol interference (ISI) between adjacent pulses: neighboring pulses overlap, and the achievable bit rate and repeaterless transmission length are limited by the ISI. In this paper, the ISI as a function of the refractive index structure constant, which represents the strength of atmospheric turbulence, is derived using the temporal moment function and is numerically analyzed for the basic SONET transmission rates. The numerical results show that, as the turbulence grows stronger, the ISI increases gradually at transmission rates below that of the OC-192 (9.953 Gb/s) system, while at rates above that of the OC-768 (39.813 Gb/s) system it rises rapidly and then converges slowly. We also find that accurate information transmission over 10 km is possible in the OC-48 (2.488 Gb/s) system under any atmospheric turbulence, but becomes impossible, because severe ISI is induced, under turbulence stronger than $10^{-14}[m^{-2/3}]$ for the 100 Gb/s system, $10^{-13}[m^{-2/3}]$ for the OC-768 system, and $10^{-12}[m^{-2/3}]$ for the OC-192 system.
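The mechanism summarized above, a broadened Gaussian pulse leaking energy into the neighboring bit slot, can be illustrated with a minimal sketch. This is not the paper's temporal-moment analysis; the function name and the example widths are hypothetical, and only generic Gaussian tail math is used:

```python
import math

def isi_fraction(sigma_ns: float, bit_period_ns: float) -> float:
    """One-sided fraction of a Gaussian pulse's energy that spills
    into the adjacent bit slot: the tail probability of a zero-mean
    Gaussian with rms width sigma beyond T/2 (half the bit period)."""
    z = (bit_period_ns / 2.0) / sigma_ns
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Example: OC-48 (2.488 Gb/s) has a bit period of ~0.402 ns.
T = 1.0 / 2.488  # bit period in ns
print(isi_fraction(0.05, T))  # modest broadening -> tiny overlap
print(isi_fraction(0.2, T))   # stronger broadening -> much larger overlap
```

The sketch captures the qualitative trend in the abstract: for a fixed bit period, overlap grows rapidly with pulse width, and for a fixed pulse width, higher transmission rates (shorter bit periods) suffer more ISI.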

Comparison of Isolation Agar Method, Real-Time PCR and Loop-Mediated Isothermal Amplification-Bioluminescence for the Detection of Salmonella Typhimurium in Mousse Cake and Tiramisu (Mousse cake와 Tiramisu에 인위접종된 Salmonella Typhimurium의 식품공전 분리배지, Real-time PCR과 Loop-mediated isothermal amplification-bioluminescence의 검출 특성 비교)

  • Lee, So-Young;Gwak, Seung-Hae;Kim, Jin-Hee;Oh, Se-Wook
    • Journal of Food Hygiene and Safety
    • /
    • v.34 no.3
    • /
    • pp.290-295
    • /
    • 2019
  • Salmonella spp. are frequently associated with food and are among the most important foodborne pathogens. A recent Salmonella outbreak in Korea was associated with chocolate mousse cakes served with school meals during September 2018. The objective of this research was to compare the 3M Molecular Detection Assay 2 - Salmonella and the Korean Standard Method for Salmonella in artificially inoculated mousse (chocolate and cheese) and tiramisu cakes. Mousse (chocolate and cheese) and tiramisu cakes were artificially inoculated with S. Typhimurium. Twenty-five grams of sample were enriched with 225 mL of buffered peptone water and incubated at $37^{\circ}C$ for 24 h. After enrichment, the cultures were analyzed using the 3M Molecular Detection Assay 2 - Salmonella and the Korean Standard Method. Most of the inoculated samples showed similar results except the chocolate mousse cakes, in which real-time PCR was unable to detect S. Typhimurium even after inoculation at $10^4CFU/25g$. However, S. Typhimurium inoculated at a concentration of $10^0CFU/25g$ was detected using the 3M Molecular Detection Assay 2 - Salmonella. In chocolate mousse, detection of S. Typhimurium using real-time PCR was partially successful when dark chocolate was added at less than 15%. Negative results in real-time PCR and the 3M Molecular Detection Assay 2 - Salmonella were confirmed by gel electrophoresis. The data indicated that dark chocolate can inhibit amplification of the target gene in PCR reactions. In conclusion, the 3M Molecular Detection Assay 2 - Salmonella was better than the Korean Standard Method (real-time PCR) for the detection of S. Typhimurium in chocolate mousse cakes and chocolate mousse.

Development of Incident Detection Algorithm using GPS Data (GPS 정보를 활용한 돌발상황 검지 알고리즘 개발)

  • Kong, Yong-Hyuk;Kim, Hey-Jin;Yi, Yong-Ju;Kang, Sin-Jun
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.16 no.4
    • /
    • pp.771-782
    • /
    • 2021
  • Regular or irregular situations such as traffic accidents, damage to road facilities, maintenance or repair work, and vehicle breakdowns occur frequently on highways. Traffic services must be provided to drivers by promptly recognizing such situations, and various techniques have been developed for rapidly collecting data and detecting abnormal traffic conditions. In this study, we defined highway incidents, developed an incident-detection algorithm, built a detection system around it, and operated a test bed, thereby proposing a method that can be used to verify and demonstrate incident-detection algorithms. The approach is expected to help secure golden time for the injured by reducing the risk of secondary accidents; in addition, predictable accidents can be reduced and the detection time of unpredictable accidents shortened.

An Experimental Analysis of Ultrasonic Cavitation Effect on Ondol Pipeline Management (온돌 파이프라인 관리를 위한 초음파 캐비테이션 효과에 대한 실험적 분석)

  • Lee, Ung-Kyun
    • Journal of the Korea Institute of Building Construction
    • /
    • v.24 no.1
    • /
    • pp.67-75
    • /
    • 2024
  • In the context of Korean residential heating systems, Ondol pipelines are a prevalent choice. However, the maintenance of these pipelines becomes a complex task once they are embedded within concrete structures. As time progresses, the accumulation of sludge, corrosive oxides, and microorganisms on the inner surfaces of these pipelines diminishes their heating efficiency. In extreme scenarios, this accumulation can induce corrosion and scale formation, compromising the system's integrity. Consequently, this research introduces an ultrasonic generation system tailored for the upkeep of Ondol pipelines, with the objective of empirically assessing its practicality. This investigation delineates three variants of ultrasonic generating apparatuses: those employing surface vibration, external generation, and internal generation techniques. To emulate the presence of contaminants within the pipelines, substances in powder, slurry, and liquid forms were employed. The efficacy of the cleaning process post-ultrasonic wave application was scrutinized over time, with image analysis methodologies being utilized to evaluate the outcomes. The findings indicate that ultrasonic waves, whether generated externally or internally, exert a beneficial effect on the cleanliness of the pipelines. Given the inherent characteristics of Ondol pipelines, external generation proves impractical, thereby rendering internal generation a more viable solution for pipeline maintenance. It is anticipated that future endeavors will pave the way for innovative maintenance strategies for Ondol pipelines, particularly through the advancement of internal generation technologies for pipeline applications.

Comparison of Imported Wheat Flour Bread Making Properties and Korean Wheat Flour Bread Making Properties Made by Various Bread Making Methods (수입밀의 제빵 적성과 반죽법을 달리한 우리밀 제빵 적성의 비교)

  • Kim, Won-Mo;Lee, Gyu-Hee
    • Journal of the Korean Society of Food Science and Nutrition
    • /
    • v.44 no.3
    • /
    • pp.434-441
    • /
    • 2015
  • To develop dough methods for improving the bread making properties of Korean wheat flour, the straight dough method (SDM) and the dough and sponge method (DSM) were applied. Bread making properties such as weight of bread, specific volume, baking loss, crumb color, and texture were analyzed. In a Farinograph comparison of flour properties, Korean wheat flour showed less gluten network-forming ability than imported wheat flour. The dough making method affected bread quality, including weight of bread, specific volume, and baking loss, and SDM had the more desirable effect on these measures. Crumb color was lighter in bread made with Korean wheat flour than with imported wheat flour, whereas the dough making method did not affect crumb color. In consumer acceptance analysis, bread made by DSM showed higher consumer acceptance than that made by SDM. Regarding physicochemical changes during storage, bread made by SDM using Korean wheat flour showed higher chewiness, brittleness, and hardness than that made with imported wheat flour; however, bread made by DSM using Korean wheat flour showed chewiness similar to that of bread made by SDM using imported wheat flour. Thus, the bread making properties of Korean wheat flour bread were improved by DSM.

The Extended Site Assessment Procedure Based on Knowledge of Biodegradability to Evaluate the Applicability of Intrinsic Remediation (자연내재복원기술(Intrinsic Remediation)적용을 위한 오염지역 평가과정 개발)

  • ;Robert M. Cowan
    • Journal of Korea Soil Environment Society
    • /
    • v.2 no.3
    • /
    • pp.3-21
    • /
    • 1997
  • The remediation of contaminated sites using currently available remediation technologies requires long-term treatment and huge costs, and it is uncertain whether such technologies can achieve the remediation goal of reducing contamination to either background or health-based standards. Intrinsic remediation is a remediation technology that relies on the mechanisms of natural attenuation for the containment and elimination of contaminants in subsurface environments. Initial costs for intrinsic remediation may be higher than for conventional treatment technologies because a most comprehensive site assessment is required; the total remediation cost, however, may be the lowest among the presently employed technologies. The applicability of intrinsic remediation at a contaminated site should be thoroughly investigated to achieve the remedial goal of the technology. This paper provides the frame of an extended site assessment procedure, based on knowledge of biodegradability, to evaluate the applicability of intrinsic remediation. The procedure consists of five steps: preliminary site screening, assessment of the current knowledge of biodegradability, selection of the appropriate approach, analysis of contaminant fate and transport, and planning of the monitoring schedule. In step 1, the following are decided: 1) whether to proceed to the detailed assessment, based on rules of thumb concerning the biodegradability of organic compounds, and 2) which protocol document to follow for the detailed site assessment, according to the site characteristics, the contaminants, and the relative distance between the contamination and potential receptors. In step 2, databases on biodegradability are searched and evaluated. In step 3, the appropriate biodegradation pathways for the contaminated site are selected. In step 4, the fate and transport of the contaminants at the site are analyzed through modeling. In step 5, the monitoring schedule is planned according to the result of the modeling. Through this procedure, users may be able to obtain rational and systematic information for the application of intrinsic remediation. The collected data and information can also serve as a basis for selecting another remediation technology if the site assessment procedure concludes that intrinsic remediation should not be applied at the site.


A Study on Damage factor Analysis of Slope Anchor based on 3D Numerical Model Combining UAS Image and Terrestrial LiDAR (UAS 영상 및 지상 LiDAR 조합한 3D 수치모형 기반 비탈면 앵커의 손상인자 분석에 관한 연구)

  • Lee, Chul-Hee;Lee, Jong-Hyun;Kim, Dal-Joo;Kang, Joon-Oh;Kwon, Young-Hun
    • Journal of the Korean Geotechnical Society
    • /
    • v.38 no.7
    • /
    • pp.5-24
    • /
    • 2022
  • The current performance evaluation of slope anchors qualitatively determines the physical bonding between the anchor head and the ground as well as cracks or breakage of the anchor head. However, such evaluation does not measure these primary factors quantitatively, so time-dependent management of the anchors is almost impossible. This study evaluates a 3D numerical model built by SfM, combining UAS images with terrestrial LiDAR, to collect numerical data on the damage factors and to use the data for quantitative maintenance of the anchor system once it is installed on a slope. The UAS 3D model, which often shows relatively low precision in the z-coordinate for vertical objects such as slopes, is combined with terrestrial LiDAR scan data to improve the accuracy of the z-coordinate measurement. After validating the system, a field test was conducted with ten anchors installed on a slope with arbitrarily damaged heads. The damages (such as cracks, breakages, and rotational displacements) were detected and numerically evaluated through the orthogonal projection of the measurement system. The results show that the introduced system, at a resolution of 8K, can detect cracks less than 0.3 mm in aperture with an error range of 0.05 mm. The system can also measure the volume of the damaged part, showing that the maximum damage area of the anchor head was within 3% of the original design guideline. The ground adhesion to the anchor head, where the z-coordinate is highly relevant, was almost impossible to measure with the UAS 3D numerical model alone because of its blind spots; with the combined system, however, elevation differences between the anchor bottom and the irregular ground surface were identified, and the average value at 20 locations was calculated for the ground adhesion. Additionally, rotation angles and displacements of the anchor head of less than 1" were detected. These observations confirm that the 3D numerical model can provide quantitative data on anchor damage, and such data collection can potentially create a database to be used as a fundamental resource for quantitative anchor damage evaluation in the future.

Establishment of Fetal Heart Surgery with an Improvement of the Placental Blood Flow in Cardiopulmonary Bypass Using a Fetal Lamb Model (양태아를 이용한 심폐우회술에서의 태반혈류개선을 통한 태아심장수술의 기반기술 확립)

  • 이정렬;박천수;임홍국;배은정;안규리
    • Journal of Chest Surgery
    • /
    • v.37 no.1
    • /
    • pp.11-18
    • /
    • 2004
  • Background: We tested the effect of indomethacin and total spinal anesthesia on the improvement of placental flow during cardiopulmonary bypass in fetal lambs. Material and Method: Twenty fetuses at 120 to 150 days of gestation were subjected to bypass via a trans-sternal approach with a 12 G pulmonary arterial cannula and a 14 to 18 F venous cannula for 30 minutes. All ewes received general anesthesia with ketamine. In the fetuses, no anesthetic agents were used except a muscle relaxant. Ten fetuses served as a control group, in which the placenta worked as the oxygenator during bypass (Control group). The remainder served as an experimental group, in which pretreatment with indomethacin and total spinal anesthesia was performed before bypass with the same extracorporeal circulation technique as the control group (Experimental group). Observations were made every 10 minutes during a 30-minute bypass and a 30-minute post-bypass period. Result: Weights of the fetuses ranged from 2.2 to 5.2 kg. In the Control group, mean arterial pressure decreased from 44.7 to 14.4 mmHg and mean Pa$CO_2$ increased from 61.9 to 129.6 mmHg over the time points during bypass. The flow rate was suboptimal (74.3 to 97.0 $m\ell$/kg/min) during bypass, and all hearts fibrillated immediately after the discontinuation of bypass. On the contrary, in the Experimental group, mean arterial pressure remained higher (45.8 to 30 mmHg) during bypass (p<0.05), mean Pa$CO_2$ was lower, ranging from 59.8 to 79.4 mmHg during bypass (p<0.05), and flow rates were higher (78.8 to 120.2 $m\ell$/kg/min) during bypass (p<0.05). There was slower deterioration of cardiac function after cessation of bypass. Conclusion: In this study, we demonstrated that placental flow was increased during fetal cardiopulmonary bypass in the group pretreated with indomethacin and total spinal anesthesia. However, further studies with modifications of the bypass, including the creation of a more concise bypass circuit and the use of an axial pump, are mandatory before clinical application.

An Iterative, Interactive and Unified Seismic Velocity Analysis (반복적 대화식 통합 탄성파 속도분석)

  • Suh Sayng-Yong;Chung Bu-Heung;Jang Seong-Hyung
    • Geophysics and Geophysical Exploration
    • /
    • v.2 no.1
    • /
    • pp.26-32
    • /
    • 1999
  • Among the various seismic data processing sequences, velocity analysis is the most time-consuming and man-hour-intensive processing step. For production seismic data processing, a good velocity analysis tool as well as a high-performance computer is required; the tool must give fast and accurate velocity analysis. There are two different approaches to velocity analysis: batch and interactive. In batch processing, a velocity plot is made at every analysis point. Generally, the plot consists of a semblance contour, a super gather, and a stack panel, and the interpreter chooses the velocity function by analyzing the plot. The technique is highly dependent on the interpreter's skill and requires human effort. As high-speed graphic workstations have become more popular, various interactive velocity analysis programs have been developed. Although these programs enable faster picking of the velocity nodes using the mouse, their main improvement is simply the replacement of the paper plot by the graphic screen. The velocity spectrum is highly sensitive to the presence of noise, especially the coherent noise often found in the shallow region of marine seismic data. For accurate velocity analysis, this noise must be removed before the spectrum is computed. The velocity analysis must also be carried out by carefully choosing the location of the analysis point and accurately computing the spectrum. The analyzed velocity function must be verified by mute and stack, and the sequence usually must be repeated several times. Therefore an iterative, interactive, and unified velocity analysis tool is highly desirable. Such an interactive velocity analysis program, xva (X-Window based Velocity Analysis), was developed. The program handles all processes required in velocity analysis, such as composing the super gather, computing the velocity spectrum, NMO correction, mute, and stack. Most parameter changes yield the final stack via a few mouse clicks, thereby enabling iterative and interactive processing. A simple trace indexing scheme is introduced, and a program to make the index of the Geobit seismic disk file was developed. The index is used to reference the original input, i.e., the CDP sort, directly. A transformation technique of the mute function between the T-X domain and the NMOC domain is introduced and adopted in the program. The result of the transform is similar to the remove-NMO technique in suppressing shallow noise such as the direct wave and refracted wave; however, it has two improvements, i.e., no interpolation error and very fast computation. With this technique, the mute times can easily be designed in the NMOC domain and applied to the super gather in the T-X domain, thereby producing a more accurate velocity spectrum interactively. The xva program consists of 28 files, 12,029 lines, 34,990 words, and 304,073 characters. The program references Geobit utility libraries and can be installed in a Geobit preinstalled environment. It runs in an X-Window/Motif environment, and the program menu is designed according to the Motif style guide. A brief usage of the program is discussed. The program allows fast and accurate seismic velocity analysis, which is necessary for computing AVO (Amplitude Versus Offset) based DHI (Direct Hydrocarbon Indicator) and for producing high-quality seismic sections.


Risk Aversion in Forward Foreign Currency Markets (선도환시장(先渡換市場)에서의 위험회피도(危險回避度)에 관한 연구(硏究))

  • Jang, Ik-Hwan
    • The Korean Journal of Financial Management
    • /
    • v.8 no.1
    • /
    • pp.179-197
    • /
    • 1991
  • The most widely used approach to pricing forward exchange contracts is the arbitrage approach, based on the fundamental property of a forward contract as a derivative asset. In this approach, the relationship between forward and spot exchange prices expresses the forward price through covered interest rate parity (CIRP), i.e., as the spot price plus the interest rate differential between the two countries. In particular, since both the spot price and the interest rates are known to the decision maker at the current time, the forward price is determined under certainty, independently of expectations about the future or of investors' risk aversion. Many empirical studies of CIRP find that it holds reasonably well once transaction costs are accounted for (Frenkel and Levich; 1975, 1977). An alternative is the speculative efficiency approach (SEA), which focuses solely on the forward rate's role in predicting the future spot rate. The simplest hypothesis of this type, that the forward price equals the expected future spot price, is rejected in most empirical analyses. Accordingly, SEA studies posit that the forward price contains not only the expectation of the future spot rate but also a risk premium, and test this hypothesis empirically. Because this hypothesis does not derive from a theoretical model, and in particular because both the expectation and the risk premium are unobservable, empirical analysis faces many difficulties. To avoid these difficulties, many studies use CIRP to infer, or to describe the behavior of, the risk premium embedded in the forward price. Constructing an empirical model by invoking CIRP, however, involves several fundamental problems. First, as noted above, assuming CIRP means that the forward price is determined independently of the expectations and risk premium that are SEA's central concern. A model built on the CIRP assumption therefore cannot offer implications for the efficiency of the forward market or for equilibrium price determination; it amounts to testing empirically the market efficiency that has already been assumed. Beyond this conceptual problem, there are also estimation problems. Because most studies combine CIRP with an equilibrium pricing model for the spot asset, the resulting empirical model, unlike the underlying spot-price model, can be manipulated arbitrarily, and parameter estimates obtained from it carry unnecessary bias. This study re-examines the empirical work of Mark (1985), in which this bias problem appears clearly and concretely, and uses the data to confirm that the fundamental source of the bias in the estimated risk-aversion coefficient is the inappropriate use of CIRP. The empirical results are presented in <Table 1> of the text and are briefly summarized as follows. (A) Empirical model: This study directly uses the Lucas (1978) model, a representative multi-period asset pricing model: $$1={\beta}\;E_t[\frac{U'(C_{t+1})\;P_t\;s_{t+1}}{U'(C_t)\;P_{t+1}\;s_t}]$$ (2) where $U'(C_t)$ and $P_t$ denote the marginal utility of consumption and the price of the consumption good at time t, $s_t$ and $f_t$ the spot and forward prices of foreign exchange, and $E_t$ and ${\beta}$ the conditional expectation and the time discount factor. Mark combines equation (2) with CIRP to obtain the following model (4): $$0=E_t[\frac{U'(C_{t+1})\;P_t\;(s_{t+1}-f_t)}{U'(C_t)\;P_{t+1}\;s_t}]$$ (4) (B) Empirical results. Estimates of the risk-aversion coefficient ${\gamma}$: In Mark's study, the estimates of ${\gamma}$ vary widely, from 0 to 50.38; in particular, when nondurable consumption and the forward premium are used, the estimate of ${\gamma}$ is an abnormally high 17.51. In this study, by contrast, the estimate is 1.3, similar to the results of other studies using stock market data. Precision of the ${\gamma}$ estimates: In Mark, the standard errors of the estimates are very large, from a minimum of 15.65 to a maximum of 42.43, whereas in this study they are about 0.3 to 0.5, relatively precise. Model fit: The goodness-of-fit test for model (4) differs greatly depending on the instrumental variables used. When only current consumption and the forward premium are used, without lagged variables, model (4) is rejected at the 2.8% or 2.3% significance level, whereas model (2) is not rejected at the 5% level. As discussed above, these empirical results clearly show that transforming the equilibrium asset pricing model by means of CIRP introduces unnecessary bias.
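One way to read the passage from equation (2) to model (4), under the assumption (not stated in the abstract) that the forward position is costless at time $t$, is as an Euler equation for a zero-cost position: the contract pays $s_{t+1}-f_t$ at $t+1$ against no payment at $t$, so

$$0={\beta}\;E_t[\frac{U'(C_{t+1})\;P_t\;(s_{t+1}-f_t)}{U'(C_t)\;P_{t+1}\;s_t}]$$

and since ${\beta}>0$ it can be dropped from the right-hand side, yielding (4). Whether Mark (1985) derives (4) this way or by substituting CIRP directly into (2) is not determined by the abstract alone.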
