• Title/Summary/Keyword: Bank layer


A Study on the Correlation between the Harmful Cyanobacterial Density and Phycocyanin Concentration at Recreational Sites in Nakdong River (낙동강 친수활동구간 유해 남조류 분포와 피코시아닌(Phycocyanin) 농도 상관성에 관한 연구)

  • Hyo-Jin Kim;Min-Kyeong Kim
    • Journal of Korean Society on Water Environment
    • /
    • v.39 no.6
    • /
    • pp.451-464
    • /
    • 2023
  • Harmful cyanobacterial monitoring is time-consuming and requires skilled professionals. Recently, phycocyanin, an accessory pigment unique to freshwater cyanobacteria, has been proposed as an indicator of the presence of cyanobacteria, with the advantage of rapid and simple measurement. The purpose of this research was to evaluate the correlation between harmful cyanobacterial cell density and phycocyanin concentration and to consider how the real-time water quality monitoring system can be used for algal bloom monitoring. In the lower Nakdong River, Microcystis spp. accounted for the overwhelming majority (99%) of harmful cyanobacterial cells among the four target genera. A strong correlation was observed between laboratory-measured phycocyanin concentrations and harmful cyanobacterial cell density (r = 0.90, p < 0.001), while a weaker relationship (r = 0.65, p < 0.001) was found between chlorophyll a concentration and harmful cyanobacterial cell density. When phycocyanin concentrations measured with a submersible fluorescence sensor were compared with harmful cyanobacterial cell density, the error range increased as the number of cyanobacterial cells increased. Before the estuary bank was opened, phycocyanin concentrations showed no mixing across depths over the diurnal cycle, and the surface layer showed a recurring pattern of increase and decrease over time. Because this analysis covers a period when Microcystis spp. was dominant in the lower Nakdong River in summer, the correlation between harmful cyanobacterial density and phycocyanin concentration should be further generalized through spatially and temporally expanded monitoring.
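The core of the analysis above is a Pearson correlation between phycocyanin concentration and harmful cyanobacterial cell density. The snippet below is a minimal illustrative sketch, not the study's code: the values are hypothetical placeholders, and the log transform is an assumption not stated in the abstract.

```python
# Minimal sketch of a Pearson correlation between phycocyanin and cell density.
import numpy as np
from scipy.stats import pearsonr

phycocyanin = np.array([1.2, 3.5, 8.1, 15.0, 42.3, 60.7])          # ug/L (hypothetical)
cell_density = np.array([800, 2500, 9000, 21000, 88000, 130000])   # cells/mL (hypothetical)

# Cell counts span orders of magnitude, so a log transform is often applied
# before computing the correlation (an assumption, not stated in the abstract).
r, p = pearsonr(np.log10(phycocyanin), np.log10(cell_density))
print(f"r = {r:.2f}, p = {p:.3g}")
```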

Effect to the Copper System Pigments by the Nitrogen Dioxide(NO2) Gas (이산화질소(NO2)가 구리(Cu)계통 안료에 미치는 영향)

  • Kim, Ji Won;Lee, Hwa Soo;Lee, Han Hyeong;Kim, Myoung Nam;Kang, Dai Ill
    • Journal of Conservation Science
    • /
    • v.31 no.4
    • /
    • pp.403-409
    • /
    • 2015
  • Malachite and azurite are typical copper-based pigments that have been used in mural paintings since ancient times. Because mural paintings are exposed to the external environment, their painting layers are at risk of damage from atmospheric gases. In this study, mural-painting test pieces colored with the two pigments were exposed to pollutant gases ($NO_2$, $CO_2$, $SO_2$), and the resulting changes were analyzed and evaluated. Malachite and azurite were altered by $NO_2$ but were not affected by $CO_2$ or $SO_2$. In particular, as the $NO_2$ concentration increased, exfoliation of the pigment layer became more pronounced; SEM observation showed pores forming on the pigment particles and the particles breaking into smaller fragments. For malachite exposed to $NO_2$ gas, XRD analysis revealed a new compound, rouaite (dicopper nitrate(V) trihydroxide, $Cu_2(NO_3)(OH)_3$). These results confirm that the exfoliation and discoloration of malachite and azurite are accompanied by chemical changes caused by $NO_2$.

Control of an invasive alien species, Ambrosia trifida with restoration by introducing willows as a typical riparian vegetation

  • Lee, Chang-Seok;Cho, Yong-Chan;Shin, Hyun-Cheol;Kim, Gyung-Soon;Pi, Jeong-Hoon
    • Journal of Ecology and Environment
    • /
    • v.33 no.2
    • /
    • pp.157-164
    • /
    • 2010
  • We evaluated the restoration effect of introducing willows as a means of controlling invasions of giant ragweed (Ambrosia trifida L.) on a riparian site. A preliminary survey demonstrated that the problematic exotic species, giant ragweed, and the representative riparian species, Salix koreensis, are in a competitively exclusive relationship. As an experimental restoration practice, we planted willows at 1 m intervals on the bank of the Dongmun stream at Munsan, Paju, in central western Korea. Two 50 m $\times$ 5 m plots, one restored and one non-restored, were installed for this experimental study. The non-restored plots were located on river banks covered with concrete blocks and were left without any treatment. The height of the willows was measured after each of three consecutive growing seasons and compared with the height of the giant ragweed. Although Salix gracilistyla did not reach the height of the giant ragweed, S. koreensis surpassed it in the third year after introduction. This was also reflected in the relative light intensity on the herb layer of the willow stands: the relative light intensities in stands dominated by S. koreensis or restored by introducing S. koreensis, 1.99 $\pm$ 0.33 (%, mean $\pm$ SD) and 1.92 $\pm$ 0.50 (%, mean $\pm$ SD) respectively, were lower than those in stands treated with S. gracilistyla, 3.01 $\pm$ 0.43 (%, mean $\pm$ SD). Giant ragweed stands receive full sunlight because no vegetation layer stands higher than the herb layer formed by the giant ragweed. Detrended Correspondence Analysis ordination based on the naturally established vegetation showed that stands dominated by willows and by giant ragweed differed in species composition, and the species composition of the restoratively treated sites resembled the reference sites more closely than that of the non-treated sites. The species diversity (H') of the sites restored by introducing S. koreensis and S. gracilistyla was higher than that of the non-restored site dominated by A. trifida. On the basis of these results, the restoration of riparian vegetation with integrated features could contribute not only to the control of exotic plants, including giant ragweed, but also to the diversity and stability of riverine ecosystems.
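For reference, the species diversity index H' cited above is the Shannon index. The snippet below is a minimal illustrative sketch, not the authors' code; the abundance counts are hypothetical, and the natural-log base is an assumption (the abstract does not state the base used).

```python
# Minimal sketch of the Shannon diversity index H' from species abundance counts.
import numpy as np

def shannon_h(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species proportions p_i."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log(p)))

# Hypothetical species abundances for a restored and a non-restored plot.
restored = [12, 8, 5, 5, 3, 2, 2, 1]
non_restored = [40, 3, 1]
print(shannon_h(restored), shannon_h(non_restored))
```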

The Improved Antigen-binding Activity of Biosimilar Remicade ScFv Antibodies by Fusion of the Leucine Zipper Domain (Leucine zipper도메인의 융합에 의한 바이오시밀러 레미케이드 Single-chain Fv 항체의 항원 결합력 개선)

  • Kim, Jin-Kyoo;Kim, Tae Hwan
    • Journal of Life Science
    • /
    • v.30 no.11
    • /
    • pp.1012-1020
    • /
    • 2020
  • Remicade is a therapeutic biosimilar natural antibody in which the mouse variable domains have been linked to human constant domains. It is a chimeric monoclonal antibody specific to tumor necrosis factor-alpha (TNF-α) and was developed for the treatment of rheumatoid arthritis. To investigate the biological activity of the Remicade antibody, we carried out a bioinformatics study using the Protein Data Bank to characterize the TNF-α antigen-binding mechanism of the natural Remicade antibody. Because production of the Remicade antibody is often limited by the genetic instability of the natural antibody-producing cells, we generated a Remicade single-chain variable fragment antibody (Remicade ScFv), in which the heavy-chain variable domain (VH) is joined to the light-chain variable domain (VL) by a polypeptide linker. Furthermore, the ScFv was fused to a leucine zipper (Remicade ScZip) to obtain higher production and higher antigen-binding activity than the ScFv alone. The Remicade ScFv and Remicade ScZip were expressed in Escherichia coli and purified on a Ni-NTA agarose column. As expected, the purified proteins migrated at 28.80 kDa and 33.96 kDa in sodium dodecyl sulfate-polyacrylamide gel electrophoresis. No TNF-α antigen-binding activity of the Remicade ScFv was observed by ELISA or western blot, whereas the Remicade ScZip showed antigen-binding activity. Additional bio-layer interferometry analysis confirmed the antigen-binding activity of the Remicade ScZip, suggesting that the leucine zipper stabilized the folding of the ScZip under denaturing conditions and improved its TNF-α antigen-binding activity.

Study on Material Characteristic of Daegu Modern History Museum Collection Rickshaw (대구근대역사관 소장 인력거 재질분석 연구)

  • Lee, Ui Cheon;Lee, Yeong Ju;Kim, Soo Chul
    • Journal of Conservation Science
    • /
    • v.38 no.2
    • /
    • pp.133-143
    • /
    • 2022
  • In this study, we analyzed each material of a rickshaw owned by the Daegu Modern History Museum. The purpose was to identify the materials in a modern cultural asset that combines a variety of materials in a complex way and to establish basic data for its preservation and management. Portable X-ray fluorescence (P-XRF) analysis of the metal, species identification of the wood, fiber identification, and paint film analysis (microscope observation, SEM-EDS, FTIR) were carried out. The metal parts were identified as brass, with copper, zinc, and iron detected. The wooden parts of the body were identified as oak (Quercus acutissima), Japanese cedar (Cryptomeria japonica), bamboo (Bambusoideae), and Torreya nucifera (Torreya spp.). The fiber parts consisted mainly of cotton, but some parts were made of leather. The paint was applied in multiple layers of cashew paint (a synthetic paint used in place of lacquer). In sum, the rickshaw body was built up of overlapping layers of fiber, metal (solid), paint, and colored (black and red) layers.

Feasibility of Deep Learning Algorithms for Binary Classification Problems (이진 분류문제에서의 딥러닝 알고리즘의 활용 가능성 평가)

  • Kim, Kitae;Lee, Bomi;Kim, Jong Woo
    • Journal of Intelligence and Information Systems
    • /
    • v.23 no.1
    • /
    • pp.95-108
    • /
    • 2017
  • Recently, AlphaGo, Google DeepMind's Go-playing artificial intelligence program, won a decisive victory against Lee Sedol. Many people thought a machine could not beat a human at Go because, unlike chess, the number of possible move sequences exceeds the number of atoms in the universe, but the result was the opposite of what was predicted. After the match, artificial intelligence came into focus as a core technology of the fourth industrial revolution and attracted attention from various application domains. In particular, deep learning has drawn attention as the core artificial intelligence technique used in the AlphaGo algorithm. Deep learning is already being applied to many problems and performs especially well in image recognition. It also performs well on high-dimensional data such as voice, images, and natural language, where existing machine learning techniques struggled to achieve good performance. In contrast, it is difficult to find deep learning research on traditional business data and structured data analysis. In this study, we examined whether deep learning techniques can be used not only for recognizing high-dimensional data but also for binary classification problems in traditional business data analysis, such as customer churn analysis, marketing response prediction, and default prediction, and we compared their performance with that of traditional artificial neural network models. The experimental data are the telemarketing response data of a bank in Portugal; the input variables include age, occupation, loan status, and the number of previous telemarketing contacts, and the binary target variable records whether the customer intends to open an account. To evaluate the applicability of deep learning algorithms and techniques to binary classification, we compared the performance of various models using the CNN and LSTM algorithms and the dropout technique, which are widely used in deep learning, with that of MLP models, the traditional artificial neural network. Because not all network design alternatives can be tested, the experiment was conducted with restricted settings for the number of hidden layers, the number of neurons per hidden layer, the number of output filters, and the application of dropout. The F1 score was used to evaluate the models, since it shows how well a model classifies the class of interest rather than overall accuracy. The deep learning techniques were applied as follows. The CNN algorithm recognizes features by reading values adjacent to a given value, but the distance between business data fields does not matter because the fields are usually independent; we therefore set the CNN filter size to the number of fields so that the model learns the characteristics of the whole record at once, and added a hidden layer to make decisions based on the extracted features. For the model with two LSTM layers, the input direction of the second layer was reversed relative to the first layer to reduce the influence of field position. For the dropout technique, neurons were dropped with a probability of 0.5 in each hidden layer. The experimental results show that the model with the highest F1 score was the CNN model with dropout, followed by the MLP model with two hidden layers and dropout. Several findings emerged from the experiment. First, models using dropout make slightly more conservative predictions than those without it and generally classify better. Second, CNN models classify better than MLP models, which is interesting because CNNs have rarely been applied to binary classification problems of this kind, beyond the fields where their effectiveness has already been proven. Third, the LSTM algorithm appears unsuitable for these binary classification problems because its training time is too long relative to the performance improvement. From these results, we can confirm that some deep learning algorithms can be applied to solve business binary classification problems.
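As a rough illustration of the setup described above, the sketch below trains an MLP with two hidden layers and 0.5 dropout on a binary classification task and reports the F1 score. It is not the authors' code: synthetic data stands in for the Portuguese bank telemarketing dataset, and the layer sizes, optimizer, and training settings are assumptions.

```python
# Minimal sketch: MLP with dropout (p = 0.5) for binary classification, scored with F1.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score
import tensorflow as tf

# Synthetic, class-imbalanced stand-in for the bank telemarketing data
# (age, occupation, loan status, previous contacts, ...).
X, y = make_classification(n_samples=5000, n_features=16,
                           weights=[0.88, 0.12], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Two hidden layers, each followed by dropout with probability 0.5,
# mirroring the MLP-plus-dropout configuration discussed in the abstract.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(X_tr.shape[1],)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(X_tr, y_tr, epochs=20, batch_size=64, verbose=0)

# Evaluate with the F1 score, which emphasizes the minority (response) class
# rather than overall accuracy.
y_pred = (model.predict(X_te, verbose=0) > 0.5).astype(int).ravel()
print("F1:", f1_score(y_te, y_pred))
```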

Manufacture of Spent Layer Chicken Meat Products by Natural Freeze-Drying during Winter (겨울철 자연 동결 건조에 의한 노계 육제품의 제조)

  • Lee, Sung-Ki;Kang, Sun-Moon;Lee, Ik-Sun;Seo, Dong-Kwan;Kwon, Il-Kyung;Pan, Jo-No;Kim, Hee-Ju;Ga, Cheon-Heung;Pak, Jae-In
    • Food Science of Animal Resources
    • /
    • v.30 no.2
    • /
    • pp.277-285
    • /
    • 2010
  • The objective of this study was to manufacture spent layer chicken meat products by natural freeze-drying. Spent layer chickens slaughtered at 80 wk were obtained from a local slaughterhouse and split into carcass halves. The samples were divided into the following groups: 1) control (non-cured), 2) cured, and 3) cured with 2% trehalose before drying. The cured meats were held at $2^{\circ}C$ for 7 d and then transferred to a natural drying site located in Injae City, Gangwondo, Korea. The experiment was conducted from January to March 2008; the average temperature, relative humidity, and wind speed were $-1.5^{\circ}C$, 63%, and 1.8 m/sec, respectively. The cured treatments showed higher pH, lower water activity (Aw), and lower shear force values than the control. Based on the TBARS (2-thiobarbituric acid reactive substances) levels and volatile basic nitrogen values, lipid oxidation and protein deterioration were inhibited in the cured treatments during drying. Trehalose acted as a humectant, maintaining a lower water activity despite the relatively higher moisture content during drying. The polyunsaturated fatty acid content and sensory attributes were higher in the cured treatments than in the control during drying. Most bacterial counts in the treated groups were about 2 log CFU/g lower after 1 month of drying, Salmonella spp. and Listeria spp. were not found in any treatment, and no microbial safety problem was associated with the dried meat products. Based on these results, dried meat products can be manufactured from pre-cured spent layer chickens by natural freeze-drying during winter.

A Geophysical Study on the Geotectonics and Opening Mechanism of the Ulleung Basin, East Sea (동해 울릉분지의 지구조 및 성인에 관한 지구물리학적 연구)

  • Suh, Man-Cheol;Lee, Gwang-Hoon;Shon, Ho-Woong
    • The Sea:JOURNAL OF THE KOREAN SOCIETY OF OCEANOGRAPHY
    • /
    • v.3 no.1
    • /
    • pp.34-44
    • /
    • 1998
  • Analysis of gravity, magnetic, and seismic reflection data from the Ulleung Basin, East Sea, provides some insights into the opening mechanism and crustal type of the basin. Free-air gravity anomaly data show positive anomalies of about 40-60 mGal near the Korea Plateau and Oki Bank and of about -20 to 20 mGal in the central basin. Bouguer gravity anomaly data exhibit NE-SW-trending positive anomalies of about 150 mGal in the central basin, which are interpreted to be related to high-density crustal material. Abrupt changes in both free-air and Bouguer gravity anomaly profiles across the basin margins may be due to the transition between continental and oceanic crust. Magnetic anomalies in the basin are generally less than -400 nT. No stripe pattern is evident in the magnetic anomaly map, but a NW-SE-trending symmetric pattern is seen in some magnetic profiles. This symmetric pattern is probably associated with the high-density crustal material in the central basin suggested by the Bouguer gravity anomaly. The acoustic basement in the deep part of the basin has only a small amount of local relief. No graben or half-graben structures, from which mechanical extension might be inferred, are seen in the acoustic basement. The lack of high-relief structures in the acoustic basement may suggest that the basin is underlain by oceanic crust or that the basement is overlain by a thick volcanic layer that obscures its structures and relief. The high-density crust in the central basin inferred from gravity data, the abrupt changes in gravity anomalies across the basin margins, the symmetric pattern seen in some magnetic anomaly profiles, and the lack of relief in the acoustic basement may all suggest a sea-floor spreading origin for the Ulleung Basin.


The Influence of the Commercial Flame Retardant to the Physical and Chemical Properties of Dancheong Pigments (시판용 방염제 도포에 의한 단청안료의 물리화학적 변화 연구)

  • Lee, Han Hyoung;Kim, Jin Gyu;Lee, Hwa Soo;Lee, Ha Rim;Chung, Yong Jae;Kim, Do Rae;Han, Gyu Seong
    • Journal of Conservation Science
    • /
    • v.32 no.2
    • /
    • pp.249-259
    • /
    • 2016
  • The effect of flame retardants on Dancheong is studied in the present work. Two kinds of flame retardants were applied to Dancheong specimens and compared with control groups to which distilled water was applied instead. The flame retardants increased the hygroscopicity of the Dancheong surface. Furthermore, they reacted with oyster shell white ($CaCO_3$) and lead red ($Pb_3O_4$), producing new chemical compounds such as tribasic calcium phosphate and lead phosphates, which cause the painted Dancheong layer to dissolve and whiten over time. In particular, when applied in excessive amounts and exposed to repeated wet and dry conditions, they aggravate these surface problems significantly. These results will provide a useful reference for studying the discoloration and whitening of Dancheong layers on many traditional wooden buildings in Korea.

Assessing the Impact of Climate Change on Water Resources: Waimea Plains, New Zealand Case Example

  • Zemansky, Gil;Hong, Yoon-Seeok Timothy;Rose, Jennifer;Song, Sung-Ho;Thomas, Joseph
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2011.05a
    • /
    • pp.18-18
    • /
    • 2011
  • Climate change is impacting and will increasingly impact both the quantity and quality of the world's water resources in a variety of ways. In some areas warming climate results in increased rainfall, surface runoff, and groundwater recharge while in others there may be declines in all of these. Water quality is described by a number of variables. Some are directly impacted by climate change. Temperature is an obvious example. Notably, increased atmospheric concentrations of $CO_2$ triggering climate change increase the $CO_2$ dissolving into water. This has manifold consequences including decreased pH and increased alkalinity, with resultant increases in dissolved concentrations of the minerals in geologic materials contacted by such water. Climate change is also expected to increase the number and intensity of extreme climate events, with related hydrologic changes. A simple framework has been developed in New Zealand for assessing and predicting climate change impacts on water resources. Assessment is largely based on trend analysis of historic data using the non-parametric Mann-Kendall method. Trend analysis requires long-term, regular monitoring data for both climate and hydrologic variables. Data quality is of primary importance and data gaps must be avoided. Quantitative prediction of climate change impacts on the quantity of water resources can be accomplished by computer modelling. This requires the serial coupling of various models. For example, regional downscaling of results from a world-wide general circulation model (GCM) can be used to forecast temperatures and precipitation for various emissions scenarios in specific catchments. Mechanistic or artificial intelligence modelling can then be used with these inputs to simulate climate change impacts over time, such as changes in streamflow, groundwater-surface water interactions, and changes in groundwater levels. The Waimea Plains catchment in New Zealand was selected for a test application of these assessment and prediction methods. This catchment is predicted to undergo relatively minor impacts due to climate change. All available climate and hydrologic databases were obtained and analyzed. These included climate (temperature, precipitation, solar radiation and sunshine hours, evapotranspiration, humidity, and cloud cover) and hydrologic (streamflow and quality and groundwater levels and quality) records. Results varied but there were indications of atmospheric temperature increasing, rainfall decreasing, streamflow decreasing, and groundwater level decreasing trends. Artificial intelligence modelling was applied to predict water usage, rainfall recharge of groundwater, and upstream flow for two regionally downscaled climate change scenarios (A1B and A2). The AI methods used were multi-layer perceptron (MLP) with extended Kalman filtering (EKF), genetic programming (GP), and a dynamic neuro-fuzzy local modelling system (DNFLMS), respectively. These were then used as inputs to a mechanistic groundwater flow-surface water interaction model (MODFLOW). A DNFLMS was also used to simulate downstream flow and groundwater levels for comparison with MODFLOW outputs. MODFLOW and DNFLMS outputs were consistent. They indicated declines in streamflow on the order of 21 to 23% for MODFLOW and DNFLMS (A1B scenario), respectively, and 27% in both cases for the A2 scenario under severe drought conditions by 2058-2059, with little if any change in groundwater levels.
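The trend analysis at the core of the assessment framework above is the non-parametric Mann-Kendall test. The following is a minimal sketch, not the authors' implementation: it omits the tie and autocorrelation corrections usually needed for real hydrologic records, and the example streamflow series is synthetic.

```python
# Minimal sketch of the Mann-Kendall trend test (no tie correction).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Return the Mann-Kendall S statistic, Z score, and two-sided p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance of S without ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))  # two-sided p-value
    return s, z, p

# Example: a hypothetical annual streamflow series with a weak downward trend.
rng = np.random.default_rng(0)
flow = 100 - 0.5 * np.arange(40) + rng.normal(0, 5, 40)
print(mann_kendall(flow))
```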
