• Title/Summary/Keyword: Valid Time

Case Study On The Seismic Design Strategy For Post-Quake Functional Buildings In China

  • Peng Liu;Xue Li;Yu Cheng;Xiaoyu Gao;Jinai Zhang;Yongbin Liu
    • International Journal of High-Rise Buildings
    • /
    • v.12 no.3
    • /
    • pp.251-262
    • /
    • 2023
  • In response to China's "Regulations on the Management of Earthquake Resistance of Building Constructions", which requires eight types of important buildings to remain functional after fortification-level earthquakes, the "Guidelines for Seismic Design of Post-quake Functional Buildings (Draft for Review)" distinguishes Class I and Class II buildings and specifies performance objectives and seismic verification requirements under design-level and severe earthquakes, respectively. In this paper, a hospital and a school building are selected as examples and designed to the fortification requirements of Intensity 8 and Intensity 7, respectively. Two design strategies, a seismic isolation scheme and an energy dissipation scheme, are evaluated through elastic-plastic dynamic time-history analysis against the requirements for post-quake functional buildings. The results show that the seismic isolation design meets the requirements in both cases, whereas the energy dissipation scheme has difficulty meeting the floor-acceleration requirements of the "Guidelines" in some cases; for those cases the scheme must be validated through a seismic resilience assessment. This research can serve as a reference for designers choosing schemes for post-quake functional buildings.

First Record of the Velvet Snail, Coriocella jayi (Littorinimorpha: Velutinidae) from Korea

  • Yucheol Lee;Damin Lee;Jina Park;Joong-Ki Park
    • Animal Systematics, Evolution and Diversity
    • /
    • v.40 no.2
    • /
    • pp.130-134
    • /
    • 2024
  • The family Velutinidae is found in intertidal and subtidal habitats worldwide, including Arctic and Antarctic seas. Its members are characterized by a fragile shell that is partially or entirely covered by the mantle. Eight valid species of the genus Coriocella have been reported, mostly from the Indo-West Pacific. Here we report Coriocella jayi Wellens, 1996 from Korean waters for the first time, describe its external morphology and radula characteristics using scanning electron microscopy, and provide the mtDNA cox1 sequence as DNA barcode information. This species is distinguished from congeners by the six cylinder-shaped tubercular lobes on the dorsal part of the mantle and by mantle color. A phylogenetic tree based on the mtDNA cox1 sequence data recovers the two Coriocella species (C. jayi and C. nigra) as sister taxa among the Velutinidae species sampled, a relationship strongly supported by a 100% bootstrap value. Despite the morphological similarities, further investigation is needed to confirm whether the African and Korean populations represent a single species with a disjunct distribution range, or two morphologically similar but distinct species.

Determining Transit Vehicle Dispatching Time (최적 배차시각 설정에 관한 해석적 연구)

  • Park, Jun-Sik;Go, Seung-Yeong;Kim, Jeom-San;Gwon, Yong-Seok
    • Journal of Korean Society of Transportation
    • /
    • v.25 no.3
    • /
    • pp.137-144
    • /
    • 2007
  • This study takes an analytical approach to determining transit dispatching schedules (headways). Determining a timetable is an important step in transit system planning. In general, for demand-responsive service the transit headway should be shorter during peak hours than during off-peak hours: under inelastic, fixed demand, this minimizes passenger waiting time. The headway should lengthen as operating costs increase and shorten as demand and waiting time costs increase. The optimal headway depends on the amount of ridership, and each individual vehicle dispatching time depends on the distribution of ridership over time. This study provides a theoretical foundation for a dispatching scheme consistent with common sense. Previous research suggested a dispatching scheme with even headways; according to this research, however, that is valid only for the specific case of a uniform demand pattern. This study generalizes that earlier analysis and suggests a simple method for setting a timetable without complex and difficult calculations. Further, if the time axis is replaced by a space axis, the analysis can be extended to spacing problems for facilities such as roads, stations, and routes.
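The cost trade-off sketched in this abstract (operating cost falls and waiting cost rises as headway grows) is captured by the classic square-root headway rule from transit planning. The sketch below illustrates that textbook result, not necessarily the paper's own derivation; all cost values are hypothetical:

```python
import math

def optimal_headway(demand_rate, wait_cost, dispatch_cost):
    """Headway h minimizing cost per unit time
        C(h) = dispatch_cost / h + wait_cost * demand_rate * h / 2,
    where h/2 is the average wait under uniform passenger arrivals.
    Setting dC/dh = 0 gives h* = sqrt(2 * dispatch_cost / (wait_cost * demand_rate)).
    """
    return math.sqrt(2 * dispatch_cost / (wait_cost * demand_rate))

# Hypothetical values: peak demand 600 pax/h vs. off-peak 150 pax/h
peak = optimal_headway(demand_rate=600, wait_cost=10.0, dispatch_cost=120.0)
off_peak = optimal_headway(demand_rate=150, wait_cost=10.0, dispatch_cost=120.0)
print(round(peak, 3), round(off_peak, 3))  # prints 0.2 0.4 (hours): shorter headway at peak
```

Consistent with the abstract, higher demand yields a shorter optimal headway; only when demand is uniform over the day does a single even headway remain optimal.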

Analysis on the Correction Factor of Emission Factors and Verification for Fuel Consumption Differences by Road Types and Time Using Real Driving Data (실 주행 자료를 이용한 도로유형·시간대별 연료소모량 차이 검증 및 배출계수 보정 지표 분석)

  • LEE, Kyu Jin;CHOI, Keechoo
    • Journal of Korean Society of Transportation
    • /
    • v.33 no.5
    • /
    • pp.449-460
    • /
    • 2015
  • The reliability of air quality evaluations for green transportation can be improved by applying correct emission factors. Unlike previous studies, which estimated emission factors for vehicles in laboratory experiments, this study investigates emission factors by road type and time of day using real driving data. The data were collected with a Portable Activity Monitoring System (PAMS) by road type and time of day, and fuel consumption was compared and analyzed across the collected data. The results show that fuel consumption on national highways is 17.33% higher than on expressways. In addition, at 22.5 km/h the average fuel consumption at peak time is 4.7% higher than at non-peak time. The differences in fuel consumption by road type and time are verified using ANCOVA and MANOVA. The study's hypothesis, that fuel consumption differs by road type and time even at the same travel speed, thus proves valid. Correction factors for emission factors are also suggested based on the differences in fuel consumption. This study is expected to improve the reliability of emission estimates for mobile pollution sources.
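The kind of verification described here, testing whether mean fuel consumption differs between road types at a comparable speed, can be sketched with a hand-rolled one-way ANOVA F-statistic. The samples below are hypothetical, not the paper's PAMS data:

```python
def one_way_anova_f(groups):
    """F = between-group mean square / within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical fuel-consumption samples (L/100 km) at a similar travel speed
expressway = [7.9, 8.1, 8.0, 8.2, 7.8]
national   = [9.3, 9.6, 9.4, 9.2, 9.5]
f = one_way_anova_f([expressway, national])
print(f > 4.0)  # prints True: a large F suggests a real road-type difference
```

The paper's ANCOVA additionally controls for speed as a covariate; this sketch shows only the core between-group versus within-group variance comparison.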

Analysis of Curriculum Development Processes and the Relationship between General Statements of the Curriculum and Science Curriculum (교육과정 개발 체제 및 총론과 과학과 교육과정의 연계성 분석)

  • Lee, Yang-Rak
    • Journal of The Korean Association For Science Education
    • /
    • v.24 no.3
    • /
    • pp.468-480
    • /
    • 2004
  • It has been criticized that there is a discrepancy between the 'general statements' of the curriculum and the subject-matter curricula. Possible reasons are as follows. The developers of the general statements were curriculum specialists who were not well positioned to write general statements and subject-matter guidelines reflecting the characteristics of science content, to examine the developed science curriculum, or to give feedback to the science curriculum developers. Under the present system, in which the curriculum is developed in ten months or less by a research team commissioned on short and unpredictable notice, it is difficult to develop a valid and precise science curriculum reflecting the purpose of the general statements and teachers' needs. The inadequacy of these development processes resulted in (1) inconsistent statements about the school year in which the differentiated curriculum is to be applied, (2) abstract and ambiguous statements about the characteristics, teaching-learning, and assessment guidelines of enrichment activities, and (3) failure to reduce science content to a reasonable level. Therefore, curriculum development centers should be designated in advance to conduct basic research on an ongoing basis, and organized into a cooperative system. Two or more years of development time and wider participation of scientists are recommended to develop a more valid and precise science curriculum. In addition, commentaries on the science curriculum should be published before textbook writing begins.

Lane Detection in Complex Environment Using Grid-Based Morphology and Directional Edge-link Pairs (복잡한 환경에서 Grid기반 모폴리지와 방향성 에지 연결을 이용한 차선 검출 기법)

  • Lin, Qing;Han, Young-Joon;Hahn, Hern-Soo
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.20 no.6
    • /
    • pp.786-792
    • /
    • 2010
  • This paper presents a real-time lane detection method that can accurately find lane-mark boundaries in complex road environments. Unlike many existing methods, which devote much attention to fitting lane-mark positions among a great many outliers at the post-processing stage, the proposed method aims to remove those outliers as early as the feature extraction stage, so that the search space at post-processing is greatly reduced. To achieve this, a grid-based morphology operation is first used to generate regions of interest (ROI) dynamically. Within these ROIs, a directional edge-linking algorithm with directional edge-gap closing links edge pixels into edge-links lying in valid directions; these directional edge-links are then grouped into pairs by checking for a valid lane-mark width at a given height of the image. Finally, lane-mark colors are checked inside the edge-link pairs in the YUV color space, and lane-mark types are estimated with a Bayesian probability model. Experimental results show that the proposed method is effective at identifying lane-mark edges among heavy clutter in complex road environments, and the whole algorithm achieves an accuracy of about 92% at an average speed of 10 ms/frame for 320×240 images.
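The grid-based morphology step rests on standard binary morphology. As a minimal illustration of the underlying operation (my own sketch of a 3×3 dilation over grid cells, not the authors' implementation):

```python
def dilate(grid):
    """Binary dilation with a 3x3 structuring element: a cell becomes 1
    if itself or any of its 8 neighbors is 1. Cells holding lane-mark
    edge evidence thus grow into a connected region of interest."""
    rows, cols = len(grid), len(grid[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols and grid[rr][cc]:
                        out[r][c] = 1
    return out

roi = dilate([[0, 0, 0],
              [0, 1, 0],
              [0, 0, 0]])
print(roi)  # the single seed cell grows to fill the whole 3x3 neighborhood
```

Applying this at the coarse grid level rather than per pixel is what keeps the ROI generation cheap enough for the reported 10 ms/frame budget.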

Towards Safety Based Design Procedure for Ships

  • Bakker, Marijn;Boonstra, Hotze;Engelhard, Wim;Daman, Bart
    • Journal of Ship and Ocean Technology
    • /
    • v.5 no.3
    • /
    • pp.1-13
    • /
    • 2001
  • Present-day rules and regulations for the design and construction of ships are almost without exception prescriptive and deterministic in nature. It is often argued that this situation is far from ideal: it does no justice to the advances made during the past decades in marine engineering tools, both in methodology and in computational power. Within IMO this has been recognized for some time and has resulted in proposals to use Formal Safety Assessment (FSA) as a tool to improve and modernize the rule-making process. The present paper makes use of elements of the FSA methodology, but instead of working towards generic regulations or requirements, a risk assessment approach, not unlike a 'safety case' valid for a certain ship or type of ship, is worked out. Delft University of Technology investigated the application of safety assessment procedures in ship design, in co-operation with Anthony Veder Shipowners and safety experts from Safety Service Center BV. The ship considered is a semi-pressurized, fully refrigerated LPG carrier. On the assumption that a major accident occurs, various accident scenarios that would impair the safety of the carrier were considered and assessed. In a so-called Risk Matrix, in which accident frequencies are plotted against the consequences of the scenarios, the calculated risks all appeared to be in the ALARP ('as low as reasonably practicable') region. A number of design alternatives were compared, on both safety merits and cost-effectiveness. The experience gained with this scenario-based approach will be used to establish a set of general requirements for safety assessment techniques in ship design. Since assessment results will most probably be presented in a quasi-quantified manner, the requirements concern the uniformity of safety assessments, so that valid comparisons between different assessment studies can be made.
Safety assessment founded on these requirements will provide a validated and helpful source of data in the coming years, and will give naval architects and engineers the tools, experience, and data for safety assessment procedures in ship design. However, considerable effort is still needed to make the methods applicable in day-to-day practice.
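The Risk Matrix idea described above, accident frequency set against consequence with each scenario falling into an acceptable, ALARP, or intolerable region, can be sketched as a simple classification. The index scales and band thresholds below are hypothetical, not those of the study:

```python
def risk_region(frequency_index, consequence_index):
    """Classify a scenario on a qualitative risk matrix.
    Indices run 1 (lowest) to 5 (highest); their product serves as a
    simple risk score with hypothetical band boundaries."""
    score = frequency_index * consequence_index
    if score <= 4:
        return "acceptable"
    if score <= 15:
        return "ALARP"  # as low as reasonably practicable
    return "intolerable"

# A rare-but-severe scenario and a frequent-but-minor one both land in ALARP,
# mirroring the study's finding that all calculated risks fell in that region
print(risk_region(1, 5), risk_region(5, 2))  # prints: ALARP ALARP
```

Design alternatives can then be compared by how far they move scenarios down the matrix per unit cost, which is the cost-effectiveness comparison the abstract mentions.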


A Study on Validity, Reliability and Practicality of a Concept Map as an Assessment Tool of Biology Concept Understandings (생물 개념 이해의 평가 도구로서 개념도의 타당도, 신뢰도 그리고 현실 적용 가능성에 대한 연구)

  • Cho, Jung-II;Kim, Jung
    • Journal of The Korean Association For Science Education
    • /
    • v.22 no.2
    • /
    • pp.398-409
    • /
    • 2002
  • The purpose of this study was to investigate the validity, reliability, and practicality of the concept map as an assessment tool in the context of biology concept learning. Forty undergraduate students produced concept maps, and the maps were scored by preservice science teachers using one of three scoring methods, developed by Burry-Stock, by Novak and Gowin, and by McClure and Bell, with two scorers assigned to each method. Two of the three methods were found to be highly valid, while Burry-Stock's scoring method showed little validity. Regarding internal consistency, considerably high agreement was found between each pair of scorers, judging from the high correlation coefficients obtained for each scoring method. Scoring a map took from 1.13 to 3.70 minutes on average, which shows that concept mapping can be used in school classrooms with limited time and personnel. These findings suggest that concept mapping can be an appropriate tool for assessing biology concept understanding.
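The internal-consistency check reported here, a high correlation between the two scorers assigned to each method, comes down to a plain Pearson correlation over the scorers' marks. A minimal sketch with hypothetical score lists (not the study's data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two scorers' marks."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

scorer_a = [12, 18, 9, 22, 15]  # hypothetical concept-map scores
scorer_b = [11, 19, 10, 21, 14]
print(round(pearson(scorer_a, scorer_b), 3))  # close to 1: consistent scoring
```

A coefficient near 1 for a scorer pair indicates that the scoring rubric is applied consistently, which is the sense of "internal consistency" used in the abstract.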

Problems in Quantification of Adequacy of Academic Library Collections -Critical Analysis of Standards for Academic Libraries in the U.S.- (종합대학 도서관장서의 적정량기준 설정에 관한 고찰 -미국의 종합대학도서관기준을 중심으로-)

  • Chung Young Sun
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.8
    • /
    • pp.183-207
    • /
    • 1981
  • Library standards have been a source of considerable controversy, and many problems are involved in developing a standard for university library collections. For evaluation purposes, standards should be precise, quantifiable, and measurable. In the United States, however, standards for academic libraries are limited to qualitative statements and principles. Quantitative standards, when given, are usually related to the size of the population served by the institution, or the prescribed quantitative objectives are arbitrarily formulated by value judgments. This paper attempts to explain the problems involved in developing a quantitative standard for academic library collections. Two problems in formulating the optimal size of a collection are identified: the theoretically faulty concept of adequacy of a collection in the face of the diversity of university libraries, and the difficulties of quantification and measurement that follow from that lack of a sound concept of adequacy. Nevertheless, quantifying an adequate collection size proves useful on the practical level, even if not theoretically valid. ACRL, Clapp/Jordan, and Voigt developed formulas or models for setting the optimal collection size of a particular university library, and the main purpose of this study is to analyze these formulas. The ACRL standard was drawn from observation and analysis of statistics on leading library collections, apparently on the assumption that a high-grade institution would have a good library collection. This study criticizes the ACRL standard for failing to include some determinants of measurement and points out its limitations. In contrast, Clapp/Jordan developed a formula more scientifically, based on bibliographical sources.
It is similarly empirical but has the advantage of bringing into play the elements that make universities diverse in nature. The ACRL and Clapp/Jordan formulas share two major defects: (1) the specific subject needs of the collection are not indicated directly, and (2) percentage rate of growth is used as an indicator in measuring the potential utility of a collection. Thus both formulas fail to provide a basis for meaningful evaluation. Voigt further developed a model for determining acquisition rates for currently published materials based on bibliographic techniques. The Voigt model encourages experimentation with different programs and different allocations of input resources, designed to meet the needs of the library's particular population. A standard for university library collections can be formulated in terms of input (the traditional indicator) or, additionally, in terms of output (cost-effectiveness). Cost-effectiveness is expressed as user satisfaction: the ability to provide wanted materials within a reasonable time. A simple quantitative method thus neither covers all the diverse situations of university library collections nor measures the effectiveness of collections, and a valid standard cannot be established without further research.
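A Clapp/Jordan-type formula is simply a linear model of adequate collection size in institutional variables. The sketch below shows the form only; the coefficients and inputs are hypothetical, not the published Clapp/Jordan values:

```python
def collection_size(base, per_faculty, per_student, per_grad_field,
                    faculty, students, grad_fields):
    """Clapp/Jordan-style linear estimate of an adequate collection:
    a base collection plus volumes per faculty member, per student,
    and per graduate field offered. All coefficients are hypothetical."""
    return (base + per_faculty * faculty + per_student * students
            + per_grad_field * grad_fields)

volumes = collection_size(base=50000, per_faculty=100, per_student=12,
                          per_grad_field=3000,
                          faculty=400, students=10000, grad_fields=20)
print(volumes)  # 50000 + 40000 + 120000 + 60000 = 270000
```

Because the additive terms reflect what makes universities differ (faculty, enrollment, graduate programs), such a formula adapts to institutional diversity in a way a flat volume count cannot, which is the advantage the abstract attributes to Clapp/Jordan over the ACRL approach.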


Estimation for the Variation of the Concentration of Greenhouse Gases with Modified Shannon Entropy (변형된 샤논 엔트로피식을 이용한 온실가스 농도변화량 예측)

  • Kim, Sang-Mok;Lee, Do-Haeng;Choi, Eol;Koh, Mi-Sol;Yang, Jae-Kyu
    • Journal of Environmental Science International
    • /
    • v.22 no.11
    • /
    • pp.1473-1479
    • /
    • 2013
  • Entropy is a measure of disorder or uncertainty. In environmental science the term has been used qualitatively to describe correlations with pollution. In this research, three different entropies were defined and characterized in order to quantify that qualitative usage. We deal with three newly defined entropies $E_1$, $E_2$, and $E_3$, derived from Shannon entropy in information theory and reflecting the concentrations of the three major greenhouse gases $CO_2$, $N_2O$, and $CH_4$ represented as probability variables. First, $E_1$ evaluates the total entropy from the concentration difference of each greenhouse gas over three periods: the industrial revolution, the post-industrial revolution, and the information revolution. Next, $E_2$ evaluates the entropy with the logarithm base increasing along with the accumulated time unit. Lastly, $E_3$ evaluates the entropy with the logarithm base fixed at 2 as a function of time. The analytical results are as follows. $E_1$ indicates the degree of prediction reliability with respect to the variation of the greenhouse gases: as $E_1$ increases, the concentration variation stabilizes and follows a linear correlation. $E_2$ is a valid indicator for mutual comparison among the greenhouse gases. Although $E_3$ varies locally within specific periods, it eventually follows a logarithmic curve, a pattern similar to that observed for thermodynamic entropy.
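All three measures derive from Shannon entropy applied to gas concentrations treated as probability variables. A minimal sketch of that base quantity, normalizing hypothetical concentrations into a distribution and using base 2 as in $E_3$ (the exact normalization and period differencing of the paper's $E_1$ and $E_2$ are not reproduced here):

```python
import math

def shannon_entropy(values, base=2):
    """Shannon entropy H = -sum(p * log_b(p)) of the given values
    normalized into a probability distribution (zero values skipped)."""
    total = sum(values)
    ps = [v / total for v in values if v > 0]
    return -sum(p * math.log(p, base) for p in ps)

# Hypothetical relative concentrations of CO2, N2O, CH4
h = shannon_entropy([400.0, 0.33, 1.8])
print(0.0 < h <= math.log2(3))  # prints True: bounded by log2(number of gases)
```

Since $CO_2$ dominates the mixture, the distribution is far from uniform and the entropy is small; a more even mixture of the three gases would push H toward its maximum of log2(3).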