• Title/Summary/Keyword: Single-step


Electrostatic Immobilization of D-Xylose Isomerase to a Cation Exchanger for the Conversion of D-Xylose to D-Xylulose (D-xylose에서 D-xylulose로의 전환을 위한 D-xylose Isomerase의 정전기적 고정화)

  • Hang, Nguyen Thi; Kim, Sung-Gun; Kweon, Dae-Hyuk
    • Microbiology and Biotechnology Letters / v.40 no.2 / pp.163-167 / 2012
  • Since D-xylose is not fermentable by Saccharomyces cerevisiae, its conversion to D-xylulose is required before it can be used in biotechnological industries based on S. cerevisiae. To convert D-xylose to D-xylulose with an immobilized-enzyme system, D-xylose isomerase (XI) of Escherichia coli was fused with a 10-arginine tag (R10) at its C-terminus to simplify purification and immobilization using a cation exchanger. The fusion protein XIR10 was overexpressed in recombinant E. coli and purified to high purity in a single step of cation exchange chromatography. The purified XIR10 was immobilized on a cation exchanger via the electrostatic interaction of the C-terminal 10-arginine tag. Free and immobilized XIR10 exhibited similar XI activities at various pH values and temperatures, indicating that immobilization on the cation exchanger has little effect on the enzymatic function of XIR10. Under optimized conditions for the immobilized XIR10, D-xylose was isomerized to D-xylulose with a conversion yield of 25%. These results clearly demonstrate that the electrostatic immobilization of XIR10 via the interaction between the 10-arginine tag and a cation exchanger is a practical route for the conversion of D-xylose to D-xylulose.

Study of the Process Model through a Case Study of the EU FP 7th International Technological Cooperation - Focusing on the Sub Processes in the Evaluation Process - (EU FP 7차 사례분석을 통해서 살펴본 국제기술협력 프로세스 모형에 관한 연구 -평가단계 하부프로세스를 중심으로-)

  • Kim, Jin Suk
    • Journal of the Korea Academia-Industrial cooperation Society / v.15 no.12 / pp.7012-7017 / 2014
  • The process of international technical cooperation is quite complex. If this complexity were reduced and mirrored in a simple model, it could greatly assist the government agencies and staff who support international technical cooperation. The purpose of this paper was to develop the underlying process model for the evaluation phase of international technical cooperation. The second chapter provides the theoretical background for developing a process model. The third chapter presents the features and the practical model of the evaluation phase. Chapter four presents a case with substantial empirical analysis of the developed model: the EU FP7 program was examined as a case study of the evaluation process in international technical cooperation. The concluding fifth chapter presents the limitations and future research directions. The results show that a new phase can be identified at the level of the evaluation step. The limitation of this study is that the empirical work consists of a single case study only. In the future, it will be necessary to study the sub-stages of the other steps in international technological cooperation.

Purification, and Biochemical and Biophysical Characterization of Cellobiohydrolase I from Trichoderma harzianum IOC 3844

  • Colussi, Francieli; Serpa, Viviane; Da Silva Delabona, Priscila; Manzine, Livia Regina; Voltatodio, Maria Luiza; Alves, Renata; Mello, Bruno Luan; Pereira Jr., Nei; Farinas, Cristiane Sanches; Golubev, Alexander M.; Santos, Maria Auxiliadora Morim; Polikarpov, Igor
    • Journal of Microbiology and Biotechnology / v.21 no.8 / pp.808-817 / 2011
  • Because of its elevated cellulolytic activity, the filamentous fungus Trichoderma harzianum has considerable potential in biomass hydrolysis applications. Trichoderma harzianum cellobiohydrolase I (ThCBHI), an exoglucanase, is an important enzyme in the process of cellulose degradation. Here, we report an easy single-step ion-exchange chromatographic method for the purification of ThCBHI, together with its initial biophysical and biochemical characterization. The ThCBHI produced by induction with microcrystalline cellulose under submerged fermentation was purified on DEAE-Sephadex A-50 media, and its identity was confirmed by mass spectrometry. Biochemical characterization showed that the protein has a molecular mass of 66 kDa and a pI of 5.23. As confirmed by small-angle X-ray scattering (SAXS), both full-length ThCBHI and its catalytic core domain (CCD), obtained by digestion with papain, are monomeric in solution. Secondary structure analysis of ThCBHI by circular dichroism revealed α-helix and β-strand contents of about 28% and 38%, respectively. The intrinsic fluorescence emission maximum of 337 nm was attributed to the differing degrees of exposure of ThCBHI tryptophan residues to water. Moreover, ThCBHI displayed maximum activity at pH 5.0 and 50°C, with specific activities against Avicel and p-nitrophenyl-β-D-cellobioside of 1.25 U/mg and 1.53 U/mg, respectively.

The Protective Effects of Curcuma longa Linn. Extract on Carbon Tetrachloride-Induced Hepatotoxicity in Rats via Upregulation of Nrf2

  • Lee, Hyeong-Seon; Li, Li; Kim, Hyun-Kyung; Bilehal, Dinesh; Li, Wei; Lee, Dong-Seok; Kim, Yong-Ho
    • Journal of Microbiology and Biotechnology / v.20 no.9 / pp.1331-1338 / 2010
  • This study was designed to investigate the potentially protective effects of Curcuma longa Linn. extract (CLE) on carbon tetrachloride (CCl₄)-induced hepatotoxicity in rats. Male Sprague-Dawley rats were pretreated with 50 or 100 mg/kg of CLE or 100 mg/kg of butylated hydroxytoluene (BHT) for 14 days before CCl₄ administration. In addition, the CLE control group was pretreated with 100 mg/kg CLE alone for 14 days. Three hours after the final treatment, a single dose of CCl₄ (20 mg/kg) was administered intraperitoneally to each group. After this phase of the experiment, food and water were withheld for 12 h prior to the next step. The rats were then anesthetized with urethane, and their blood and livers were collected. Serum aspartate aminotransferase and alanine aminotransferase activities, as well as hepatic malondialdehyde levels, were significantly decreased in the CLE group compared with the CCl₄-treated group. Antioxidant activities, such as superoxide dismutase, catalase, and glutathione peroxidase activities, together with glutathione content, increased considerably in the CLE group compared with the CCl₄-treated group. Phase II detoxifying enzymes, such as glutathione S-transferase, also increased significantly in the CLE group relative to the CCl₄-treated group. The content of Nrf2 was determined by Western blot analysis: CLE pretreatment increased the level of nuclear-translocated Nrf2, which in turn increased the activity of the antioxidant and phase II detoxifying enzymes. These results indicate that CLE protects against CCl₄-induced hepatotoxicity in rats via the activities of antioxidant and phase II detoxifying enzymes and through the activation of nuclear-translocated Nrf2.

Comparative Analysis of Korean Universities' Co-author Credit Allocation Standards on Journal Publications (국내대학의 학술논문 공동연구 기여도 산정 기준 비교 분석)

  • Lee, Hyekyung; Yang, Kiduk
    • Journal of Korean Library and Information Science Society / v.46 no.4 / pp.191-205 / 2015
  • As the first step in developing an optimal co-authorship credit allocation method, this study investigated the co-authorship allocation standards of Korean universities for journal publications. The study compared the standards of 27 Korean universities with Library and Information Science (LIS) departments, and analyzed the author rankings generated by applying the inflated, fractional, harmonic, and university-standard methods of co-authorship allocation to 189 Korean LIS faculty publications from 2001 to 2014. The university standards most similar to the standard co-authorship allocation method in bibliometrics (i.e., Vinkler's) were those whose co-author credits summed to 1. However, the university standards differed from Vinkler's in allocating author credit based on a primary/secondary author classification rather than on author rank. The statistical analysis of author rankings showed that the harmonic method was most similar to the university standards, although its correlation with the university standards whose co-author credits summed to more than 1 was lower. The results also suggested that middle-ranked authors are the most sensitive to the choice of allocation method. However, even the most generous university standards still penalize collaborative research by reducing each co-author's credit below that of a single author. Follow-up studies are needed to identify the optimal method of co-authorship credit allocation.
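As a quick illustration of the three bibliometric counting schemes named above (the university-specific standards vary and are not reproduced here), a minimal sketch in Python; the function names are ours, not the paper's:

```python
def inflated_credit(n_authors):
    """Inflated counting: every co-author receives full credit (sums to > 1)."""
    return [1.0] * n_authors

def fractional_credit(n_authors):
    """Fractional counting: credit is split evenly and sums to 1."""
    return [1.0 / n_authors] * n_authors

def harmonic_credit(n_authors):
    """Harmonic counting: the i-th ranked author gets credit proportional
    to 1/i, normalized so the credits sum to 1."""
    h = sum(1.0 / j for j in range(1, n_authors + 1))
    return [(1.0 / i) / h for i in range(1, n_authors + 1)]

# For 3 authors: inflated -> [1, 1, 1]; fractional -> [0.33, 0.33, 0.33];
# harmonic -> [0.545, 0.273, 0.182], weighting the first author most.
print(harmonic_credit(3))
```

The harmonic scheme is the only one of the three that both sums to 1 and decreases with author rank, which is why rank-insensitive university standards can diverge from it.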

Example-based Super Resolution Text Image Reconstruction Using Image Observation Model (영상 관찰 모델을 이용한 예제기반 초해상도 텍스트 영상 복원)

  • Park, Gyu-Ro; Kim, In-Jung
    • The KIPS Transactions: Part B / v.17B no.4 / pp.295-302 / 2010
  • Example-based super resolution (EBSR) is a method that reconstructs high-resolution images by learning patch-wise correspondences between high-resolution and low-resolution images. It can reconstruct a high-resolution image from just a single low-resolution image. However, when it is applied to a text image whose font type and size differ from those of the training images, it often produces substantial noise. The primary reason is that, in the patch matching step of the reconstruction process, input patches can be matched to inappropriate high-resolution patches in the patch dictionary. In this paper, we propose a new patch matching method to overcome this problem. Using an image observation model, it preserves the correlation between the input and output images and therefore effectively suppresses the spurious noise caused by inappropriately matched patches. This not only improves the quality of the output image but also allows the system to use a huge dictionary containing a variety of font types and sizes, which significantly improves adaptability to variation in font type and size. In experiments, the proposed method outperformed conventional methods in the reconstruction of multi-font and multi-size images. Moreover, it improved recognition performance from 88.58% to 93.54%, confirming its practical effect on recognition.
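The abstract does not spell out the observation model, so the sketch below shows the general idea under a common assumption: the low-resolution patch is a blurred, downsampled version of the high-resolution one, and candidate HR patches are validated by simulating how they would look at low resolution. All names and parameters here are illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def degrade(hr_patch, scale=2, sigma=1.0):
    """Assumed observation model: blur the HR patch, then downsample it."""
    return gaussian_filter(hr_patch, sigma)[::scale, ::scale]

def match_patch(lr_patch, hr_dictionary, scale=2, sigma=1.0):
    """Select the HR candidate whose simulated LR appearance is closest to
    the observed input patch, instead of matching in LR space alone."""
    errors = [np.sum((degrade(p, scale, sigma) - lr_patch) ** 2)
              for p in hr_dictionary]
    return hr_dictionary[int(np.argmin(errors))]
```

Verifying candidates through the degradation model is what keeps a mismatched dictionary patch (e.g., from a different font) from being selected just because it looks plausible at low resolution.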

Improving Haskell GC-Tuning Time Using Divide-and-Conquer (분할 정복법을 이용한 Haskell GC 조정 시간 개선)

  • An, Hyungjun; Kim, Hwamok; Liu, Xiao; Kim, Yeoneo; Byun, Sugwoo; Woo, Gyun
    • KIPS Transactions on Computer and Communication Systems / v.6 no.9 / pp.377-384 / 2017
  • The performance improvement of single-core processors has reached its limit, since circuit density cannot be increased any further due to overheating. Therefore, multicore and manycore architectures have emerged as viable approaches, and parallel programming has become more important. Haskell, a purely functional language, is gaining popularity in this setting since it naturally supports parallel programming through beneficial features such as implicit parallelism in evaluating expressions and monadic tools supporting parallel constructs. However, the performance of Haskell parallel programs is strongly influenced by the performance of the run-time system, including the garbage collector. Although a memory-profiling tool, GC-tune, has been proposed, a more systematic way to use it is needed: because GC-tune finds the optimal memory size by executing the target program with every possible combination of GC options, the tuning time is very long. This paper suggests a basic divide-and-conquer method that reduces the number of GC-tune executions by shrinking the search area to one quarter at every search step. Applied to two parallel programs, a maximal independent set program and a K-means program, this method reduced the memory-tuning time by a factor of 7.78 with 98% accuracy on average.
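A minimal sketch of this quartering strategy, assuming the tuner explores a two-dimensional grid of GC parameters (e.g., allocation-area and heap sizes) and that run_time executes the target program with the given settings and returns its runtime; both the grid layout and run_time are our assumptions, not GC-tune's actual interface:

```python
def quarter_search(run_time, a_range, h_range, min_span=1):
    """Divide-and-conquer tuning: probe the center of each quadrant of the
    current (A, H) search rectangle and recurse into the fastest quadrant,
    shrinking the search area to one quarter per step."""
    (a_lo, a_hi), (h_lo, h_hi) = a_range, h_range
    while a_hi - a_lo > min_span or h_hi - h_lo > min_span:
        a_mid, h_mid = (a_lo + a_hi) // 2, (h_lo + h_hi) // 2
        quadrants = [
            ((a_lo, a_mid), (h_lo, h_mid)), ((a_mid, a_hi), (h_lo, h_mid)),
            ((a_lo, a_mid), (h_mid, h_hi)), ((a_mid, a_hi), (h_mid, h_hi)),
        ]
        # keep the quadrant whose center point runs fastest
        (a_lo, a_hi), (h_lo, h_hi) = min(
            quadrants,
            key=lambda q: run_time((q[0][0] + q[0][1]) // 2,
                                   (q[1][0] + q[1][1]) // 2))
    return (a_lo + a_hi) // 2, (h_lo + h_hi) // 2
```

Exhaustive search costs O(n²) program runs over an n×n grid, while quartering costs O(log n) steps of four probes each, which is consistent with the order-of-magnitude tuning-time reduction reported.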

A Strategy for Developing New Road Projects (경관도로 등 신개념의 도로사업 개발에 관한 연구)

  • Kim, Eung-Cheol
    • International Journal of Highway Engineering / v.9 no.2 s.32 / pp.115-127 / 2007
  • Developed countries have introduced new road projects in the road construction and management fields, such as the National Scenic Byways Program (NSBP) in the USA and the Eco-road project in Japan. This study develops a conceptual model for deploying new road projects in Korea. A four-step approach is suggested for creating such projects: foundation of an act based on the existing Road Act, creation of new road project ideas, development of an evaluation process and guidelines, and enhancement of the administrative scheme. To create new road projects, three different approaches are devised: (1) designation of national roads having uniqueness across the overall spectrum, (2) designation of roads having intrinsic value in other respects, and (3) designation of single engineering structures such as bridges, tunnels, new design techniques, notable Value Engineering outcomes, and well-analyzed Life Cycle Cost Analysis practices. For the evaluation process, the NSBP program of the USA and/or the Sustainable City Award program of Korea are recommended as references. An administrative scheme and an integrated funding process for the new road projects are also suggested, based on an evaluation of the tasks of each team or division of the Korea Ministry of Construction and Transportation.


Effect of the Nonlinearity of the Soft Soil on the Elastic and Inelastic Seismic Response Spectra (연약지반의 비선형성이 탄성 및 비탄성 지진응답스펙트럼에 미치는 영향)

  • Kim, Yong-Seok
    • Journal of the Earthquake Engineering Society of Korea / v.9 no.4 s.44 / pp.11-18 / 2005
  • Inelastic seismic analysis is necessary for seismic design because of the nonlinear behavior of the structure-soil system, and the importance of performance-based design considering soil-structure interaction is recognized for reasonable seismic design. In this study, elastic and inelastic seismic response analyses of a single-degree-of-freedom (SDOF) system on a soft soil layer were performed, considering the nonlinearity of the soil, for 11 weak-to-moderate and 5 strong earthquakes scaled to nominal peak accelerations of 0.075 g, 0.15 g, 0.2 g, and 0.3 g. Seismic response analyses of the structure-soil system were performed in one step, applying the earthquake motions at the bedrock in the frequency domain using pseudo-3-D dynamic analysis software. The results indicate that it is necessary to consider nonlinear soil-structure interaction effects and to perform performance-based seismic design for the various soil layers, rather than to follow the routine procedures specified in seismic design codes. The nonlinearity of the soft soil also significantly affected the elastic and inelastic responses even under weak earthquakes, owing to nonlinear soil amplification of the earthquake motions, and this was especially pronounced for the elastic responses.
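For reference, a minimal elastic SDOF response computation (Newmark average-acceleration time stepping on a fixed base); this is only the textbook baseline against which such studies are framed, not the frequency-domain pseudo-3-D soil-structure analysis the paper performed:

```python
import numpy as np

def sdof_peak_displacement(ag, dt, period, zeta=0.05):
    """Peak displacement of an elastic SDOF oscillator (unit mass) under
    ground acceleration ag, via Newmark's average acceleration method."""
    w = 2.0 * np.pi / period
    m, c, k = 1.0, 2.0 * zeta * w, w * w
    beta, gamma = 0.25, 0.5
    u, v = 0.0, 0.0
    a = -ag[0]                       # from m*a + c*v + k*u = -m*ag at t = 0
    k_eff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
    peak = 0.0
    for ag_i in ag[1:]:
        p_eff = (-m * ag_i
                 + m * (u / (beta * dt ** 2) + v / (beta * dt)
                        + (1 / (2 * beta) - 1) * a)
                 + c * (gamma * u / (beta * dt) + (gamma / beta - 1) * v
                        + dt * (gamma / (2 * beta) - 1) * a))
        u_new = p_eff / k_eff
        v_new = (gamma / (beta * dt) * (u_new - u)
                 + (1 - gamma / beta) * v + dt * (1 - gamma / (2 * beta)) * a)
        a = ((u_new - u) / (beta * dt ** 2) - v / (beta * dt)
             - (1 / (2 * beta) - 1) * a)
        u, v = u_new, v_new
        peak = max(peak, abs(u))
    return peak
```

Sweeping `period` over a range of values and plotting the peaks yields the elastic response spectrum; the study's point is that soil nonlinearity reshapes this spectrum relative to the code-specified one.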

Content Analysis-based Adaptive Filtering in The Compressed Satellite Images (위성영상에서의 적응적 압축잡음 제거 알고리즘)

  • Choi, Tae-Hyeon; Ji, Jeong-Min; Park, Joon-Hoon; Choi, Myung-Jin; Lee, Sang-Keun
    • Journal of the Institute of Electronics Engineers of Korea SP / v.48 no.5 / pp.84-95 / 2011
  • In this paper, we present a deblocking algorithm that removes the grid and staircase noise, known as "blocking artifacts", that occurs in compressed satellite images. In the given satellite images, each row is compressed with a quantization coefficient chosen according to region complexity, so more complex regions are compressed more heavily. This approach, however, leaves blocking artifacts in the relatively simple regions that share a row with complex regions. Removing these artifacts with a general deblocking algorithm can also blur complex regions, which is undesirable, and general filters preserve curved edges poorly. Therefore, the proposed algorithm is an adaptive filtering scheme that removes blocking artifacts while preserving image details, including curved edges, using the given quantization step size and content analysis. In particular, a WLFPCA (weighted lowpass filter using principal component analysis) is employed to reduce the artifacts around edges. Experimental results showed that the proposed method outperforms SA-DCT in terms of subjective image quality.
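The abstract only names the ingredients, so the following is a schematic sketch of the core idea of quantization-aware adaptive deblocking: boundary jumps smaller than a threshold scaled by the quantization step are treated as blocking noise and relaxed, while larger jumps are kept as likely real edges. It is not the paper's WLFPCA; the function, threshold rule, and parameters are illustrative:

```python
import numpy as np

def deblock_rows(img, q_step, block=8, t=2.0):
    """Adaptively smooth vertical block boundaries: a boundary jump smaller
    than t * q_step is assumed to be quantization (blocking) noise and is
    relaxed; larger jumps are preserved as probable real image edges.
    q_step may be a scalar or a per-row array, matching the row-wise
    quantization described in the abstract."""
    out = img.astype(float)          # astype returns a working copy
    for x in range(block, out.shape[1], block):
        jump = out[:, x] - out[:, x - 1]
        noise = np.abs(jump) < t * q_step
        out[:, x - 1] += np.where(noise, jump / 4.0, 0.0)
        out[:, x]     -= np.where(noise, jump / 4.0, 0.0)
    return out
```

Because the filter strength is gated by the quantization step, heavily quantized simple regions get strong smoothing while lightly quantized complex regions, and genuine edges, are left largely untouched.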