• Title/Summary/Keyword: Software Requirements


Semantic Network Analysis of Presidential Debates in 2007 Election in Korea (제17대 대통령 후보 합동 토론 언어네트워크 분석 - 북한 관련 이슈를 중심으로)

  • Park, Sung-Hee
    • Korean journal of communication and information
    • /
    • v.45
    • /
    • pp.220-254
    • /
    • 2009
  • Presidential TV debates serve as an important instrument for viewers to evaluate the candidates' character, examine their policies, and ultimately make the political decision of how to cast their ballots. Every word a candidate utters over the course of an election campaign exerts influence by delivering ideas and by creating clashes with opponents. This study focuses on the conceptual venue, coined 'stasis' by ancient rhetoricians, in which such clashes take place, and examines the word selections made by each candidate, the manners in which they form stasis, call for evidence, educate the public, and ultimately create a legitimate form of political argumentation. The study applied computer-based content analysis using the KrKwic and UCINET software to analyze semantic networks among the candidates. The results showed that the three major candidates, Lee Myung Bak, Jung Dong Young, and Lee Hoi Chang, displayed separate patterns in their use of language, selecting words that were often neglected by their opponents. The absence of stasis and the lack of a shared vocabulary significantly undermined the effects of the debates: central questions regarding North Korea issues failed to meet basic requirements, and the respondents failed to engage in an effective argumentation process. A minimal illustrative sketch of this kind of word co-occurrence network analysis appears after this entry.

  • PDF
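
As a rough illustration of the kind of word co-occurrence network that tools such as KrKwic and UCINET operate on, the sketch below builds a small semantic network from candidate statements and ranks words by degree centrality. It is a minimal sketch only: the sample sentences, the sentence-level co-occurrence window, and the use of the networkx library are assumptions for illustration, not the study's actual pipeline or data.

```python
# Minimal sketch of word co-occurrence network analysis (illustrative only;
# the sample text and sentence-level window are assumptions, not the study's data).
from itertools import combinations
from collections import Counter
import networkx as nx

statements = [
    "north korea nuclear issue requires dialogue and economic cooperation",
    "economic cooperation with north korea depends on the nuclear issue",
    "security policy must address the north korea nuclear issue first",
]

# Count word pairs that co-occur within the same statement.
pair_counts = Counter()
for s in statements:
    words = sorted(set(s.split()))
    pair_counts.update(combinations(words, 2))

# Build a weighted co-occurrence graph and rank words by degree centrality.
G = nx.Graph()
for (w1, w2), weight in pair_counts.items():
    G.add_edge(w1, w2, weight=weight)

centrality = nx.degree_centrality(G)
for word, score in sorted(centrality.items(), key=lambda kv: -kv[1])[:5]:
    print(f"{word}: {score:.2f}")
```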

NetFPGA based capsulator Implementation and its performance evaluation for Future Internet OpenFlow Testbed (미래인터넷 OpenFlow 테스트베드 구축을 위한 NetFPGA기반 캡슐레이터 구현 및 성능평가)

  • Choi, Yun-Chul;Min, Seok-Hong;Kim, Byung-Chul;Lee, Jae-Yong;Kim, Dae-Young
    • Journal of the Institute of Electronics Engineers of Korea TC
    • /
    • v.47 no.7
    • /
    • pp.118-127
    • /
    • 2010
  • The current TCP/IP-based Internet architecture has been used for over 30 years, but it faces fundamental problems because its protocols are difficult to extend, while communication environments are changing drastically and new user requirements will emerge in the near future. To solve these problems, major countries have started Future Internet research based on a clean-slate approach and will deploy large-scale testbeds to experiment with and verify new functions. OpenFlow switching has been proposed as a protocol-independent experimental technology that can reuse legacy network devices without interfering with production Internet traffic. Korea has also started a Future Internet testbed project called FIRST, and OpenFlow switches with NetFPGA cards will be used to deploy this testbed. To interconnect the distributed testbed sites, logical tunnels must be established by encapsulating MAC frames inside unicast IP packets, because the OpenFlow switches are not directly connected. In this paper, we implement a NetFPGA-based capsulator that performs MAC-in-IP tunneling between OpenFlow switch sites on the domestic research network KOREN. The performance evaluation shows that the NetFPGA-based capsulator outperforms software-based tunneling and can be utilized in a testbed for experimentation with Future Internet technologies.
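
To make the tunneling idea concrete, the sketch below wraps a full Ethernet frame inside a unicast IP packet, in the style of EtherIP (IP protocol 97), using Scapy. This is a simplified software illustration only; the actual NetFPGA capsulator's header format is not described here, and the EtherIP-style header, the addresses, and the payload are illustrative assumptions rather than the paper's implementation.

```python
# Simplified MAC-in-IP encapsulation sketch (EtherIP-style, IP protocol 97).
# The addresses and the exact header layout are illustrative assumptions, not
# the paper's NetFPGA capsulator format.
from scapy.all import Ether, IP, Raw, raw

def encapsulate(frame: Ether, tunnel_src: str, tunnel_dst: str) -> IP:
    """Wrap a full Ethernet frame as the payload of a unicast IP packet."""
    return IP(src=tunnel_src, dst=tunnel_dst, proto=97) / Raw(load=raw(frame))

def decapsulate(packet: IP) -> Ether:
    """Recover the original Ethernet frame at the remote OpenFlow site."""
    return Ether(packet[Raw].load)

# Example: a frame seen at one OpenFlow switch site is tunneled to another.
original = Ether(src="00:11:22:33:44:55", dst="66:77:88:99:aa:bb") / Raw(b"payload")
tunneled = encapsulate(original, "10.0.0.1", "10.0.0.2")
restored = decapsulate(tunneled)
assert raw(restored) == raw(original)
```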

A Study on the Development of Road Traffic Safety Moderator Robot using AHP (AHP분석을 통한 도로 교통안전지킴이 로봇 개발에 관한 연구)

  • Lee, Young Woo;Kwon, Hyuck Jun
    • International Journal of Highway Engineering
    • /
    • v.16 no.6
    • /
    • pp.159-167
    • /
    • 2014
  • PURPOSES : The purpose of this study is to analyze the requirements for a road traffic safety moderator robot used during road repair work. The road traffic safety moderator robot is a piece of road traffic safety equipment that converges mechanical engineering and IT for road repair sites. METHODS : The study used AHP based on a survey of road-repair experts: field engineers, design engineers, public officials, and professors. The survey used paired comparisons. The survey items were safety, convenience, and economics: safety was classified into visibility and efficiency, convenience into utility and mobility, and economics into initial investment cost and maintenance & management cost. The survey alternatives were the road traffic safety moderator robot, traffic regulation by a human, traffic regulation by a mannequin, and traffic signs. The software used for AHP was Expert Choice 2000. RESULTS : In the AHP results, the weighted value of safety was the highest among the survey items at 0.488; the weighted value of convenience was 0.295 and that of economics was 0.218. For the road traffic safety moderator robot, the weighted values of efficiency and utility were the highest among the alternatives at 0.284 and 0.259, while the weighted values of initial investment cost and maintenance & management cost were the lowest at 0.203 and 0.211. In the consistency test of each item, the null hypothesis was rejected because the CR values were 0.000 in every case; therefore, the results are consistent. CONCLUSIONS : Overall, the road traffic safety moderator robot came off second-best among the alternatives, yet it was rated highly as road traffic safety equipment for road repair work because its weighted values for efficiency and utility were the highest among the alternatives; efficiency here means securing safety and utility means practical assistance during road repair. The results indicate that the road traffic safety moderator robot will be effective for traffic safety during road repair work. Economics and visibility are the aspects of the robot that need supplementation, because their weighted values were the lowest. The consistency test results are consistent because the CR values were 0.000 in every case.
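
For readers unfamiliar with how AHP turns pairwise comparisons into the weights and CR values reported above, the following sketch shows the standard eigenvector method on a hypothetical 3x3 comparison matrix for safety, convenience, and economics. The matrix entries are invented for illustration and are not the study's survey data.

```python
# Standard AHP weight and consistency-ratio calculation (eigenvector method).
# The pairwise comparison matrix below is hypothetical, not the study's data.
import numpy as np

# Pairwise comparisons of safety, convenience, economics (Saaty 1-9 scale).
A = np.array([
    [1.0, 2.0,   2.0],   # safety vs. convenience, economics
    [0.5, 1.0,   1.5],   # convenience
    [0.5, 1/1.5, 1.0],   # economics
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # normalized priority weights

n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)             # consistency index
RI = 0.58                                   # random index for n = 3
CR = CI / RI                                # consistency ratio (< 0.1 is acceptable)

print("weights:", np.round(weights, 3))
print("CR:", round(CR, 3))
```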

A Study on Nutrient Intake Status and Food Sources of Iron by Dietary Iron Density of High School Girls in Seoul (서울지역 여고생들의 식이 철분밀도에 따른 영양섭취상태 및 철분 급원식품에 관한 연구)

  • Kim, Chun-Soo;Hong, Hee-Ok;Kim, Jung-Yoon;Maeng, Won-Jai;Lee, Jung-Sug
    • Journal of Nutrition and Health
    • /
    • v.40 no.4
    • /
    • pp.371-384
    • /
    • 2007
  • This study was conducted to examine nutrient intake status and food sources of iron according to the dietary iron density of high school girls in Seoul. The 226 subjects were divided into a High group (≥ 6 mg/1,000 kcal, N=115) and a Low group (< 6 mg/1,000 kcal, N=111) by dietary iron density. The nutrient intake data obtained by the 24-hour recall method were analyzed with the Can Pro 3.0 software. The mean age of all subjects was 16.4 years; the heights and weights of the High and Low groups were 164.5 cm and 53.4 kg versus 161.7 cm and 51.7 kg, respectively. The body mass index (BMI) of the High and Low groups was 20.5 kg/m² and 19.8 kg/m², respectively. Most nutrient intakes of the High group, except energy and lipid, were higher than those of the Low group, and the High group showed significantly higher intakes of total, vegetable, and animal iron than the Low group. Ca and folate intakes of the High group were under 75% of the recommended intake (RI), and Ca, iron, folate, and vitamin C intakes of the Low group were under 65% of the RI. The percentage of subjects whose iron intake was below the estimated average requirement (EAR) was 40.0% in the High group and 77.5% in the Low group. Total food intake of the High group was higher than that of the Low group; total animal food intake was significantly higher and total vegetable food intake significantly lower in the Low group than in the High group. Iron intakes from meat, fish, shellfish, and seasonings were significantly higher in the High group than in the Low group, whereas iron intake from milk and dairy products was significantly lower in the High group. The major food sources of iron in both groups were rice, bean curd, pork, and egg, in that order.
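
The group split above is based on dietary iron density, i.e. iron intake normalized per 1,000 kcal of energy intake. A minimal sketch of that calculation and the 6 mg/1,000 kcal cutoff follows; the intake values are invented for illustration and are not the study's 24-hour recall data.

```python
# Dietary iron density (mg per 1,000 kcal) and group assignment.
# The example intake values are invented, not the study's recall data.
def iron_density(iron_mg: float, energy_kcal: float) -> float:
    return iron_mg / energy_kcal * 1000.0

def assign_group(iron_mg: float, energy_kcal: float, cutoff: float = 6.0) -> str:
    return "High" if iron_density(iron_mg, energy_kcal) >= cutoff else "Low"

print(assign_group(iron_mg=13.0, energy_kcal=1900.0))  # High (~6.8 mg/1,000 kcal)
print(assign_group(iron_mg=9.0, energy_kcal=2000.0))   # Low  (~4.5 mg/1,000 kcal)
```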

Proposal of Big Data Analysis and Visualization Technique Curriculum for Non-Technical Majors in Business Management Analysis (경영분석 업무에 종사하는 비 기술기반 전공자를 위한 빅데이터 분석 및 시각화 기법 교육과정 제안)

  • Hong, Pil-Tae;Yu, Jong-Pil
    • Journal of Practical Engineering Education
    • /
    • v.12 no.1
    • /
    • pp.31-39
    • /
    • 2020
  • Big data analysis is used at a variety of management and industrial sites and plays an important role in management decision making. The job competency of big data analysts engaged in management analysis work does not necessarily require fine-grained IT skills; rather, it requires the varied experience, humanities knowledge, and analytical skills of a data scientist. However, big data education at state-run educational institutions and job training institutions based on the National Competency Standards (NCS) proceeds from a software engineering perspective, and this teaching methodology can prove difficult and inefficient for non-technical majors. Therefore, we analyzed current big data platforms and their related technologies and defined which of them constitute the essential job competency requirements for field personnel. Based on this, courses on big data analysis and visualization techniques were organized for non-technical majors. This specialized curriculum was delivered to working-level staff of financial institutions engaged in management analysis and achieved better educational effects. The education methods presented in this study will help non-technical professionals carry out big data tasks effectively across industries and encourage the visualization of big data analysis.

Customizable Global Job Scheduler for Computational Grid (계산 그리드를 위한 커스터마이즈 가능한 글로벌 작업 스케줄러)

  • Hwang Sun-Tae;Heo Dae-Young
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.33 no.7
    • /
    • pp.370-379
    • /
    • 2006
  • A computational grid provides an environment that integrates various computing resources. A grid environment is more complex and heterogeneous than a traditional computing environment, consisting of various resources on which different software packages are installed on different platforms. For more efficient use of a computational grid, therefore, some kind of integration is required to manage grid resources more effectively. In this paper, a global scheduler is suggested that integrates grid resources at the meta level while applying various scheduling policies. The global scheduler consists of a mechanical part and three policies. The mechanical part mainly searches user queues and resource queues to select an appropriate job and computing resource; an algorithm for the mechanical part is defined and optimized. The three policies are the user-selecting policy, the resource-selecting policy, and the executing policy. These can be newly defined and freely replaced while operation of the computational grid is temporarily suspended. The user-selecting policy, for example, can be defined to give a certain user higher priority than others, the resource-selecting policy selects the computing resource that best matches the user's requirements, and the executing policy serves to overcome communication overheads in the grid middleware. Finally, various algorithms for the user-selecting policy are defined solely in terms of user fairness, and their performance is compared.
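
The separation of a mechanical scheduling core from replaceable policies can be pictured with a small sketch in which the user-selection and resource-selection policies are plain functions passed into the scheduler loop. The job/resource model and the example fairness and best-fit policies below are assumptions for illustration, not the paper's actual implementation.

```python
# Sketch of a global scheduler whose policies are pluggable functions.
# The job/resource model and the example policies are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Job:
    user: str
    cpus: int

@dataclass
class Resource:
    name: str
    free_cpus: int

def fair_user_policy(user_queues: dict) -> str:
    """Pick the user with the most waiting jobs (a simple fairness proxy)."""
    return max((u for u in user_queues if user_queues[u]),
               key=lambda u: len(user_queues[u]))

def best_fit_resource_policy(job: Job, resources: list) -> Resource:
    """Pick the resource whose free capacity most tightly fits the job."""
    candidates = [r for r in resources if r.free_cpus >= job.cpus]
    return min(candidates, key=lambda r: r.free_cpus - job.cpus) if candidates else None

def schedule_once(user_queues: dict, resources: list,
                  select_user=fair_user_policy,
                  select_resource=best_fit_resource_policy):
    """One pass of the mechanical part: choose a job, then a matching resource."""
    if not any(user_queues.values()):
        return None
    user = select_user(user_queues)
    job = user_queues[user][0]
    resource = select_resource(job, resources)
    if resource is None:
        return None
    user_queues[user].pop(0)
    resource.free_cpus -= job.cpus
    return job, resource

queues = {"alice": [Job("alice", 4), Job("alice", 2)], "bob": [Job("bob", 8)]}
pool = [Resource("node1", 8), Resource("node2", 4)]
print(schedule_once(queues, pool))
```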

Fast Selection of Composite Web Services Based on Workflow Partition (워크플로우 분할에 기반한 복합 웹 서비스의 빠른 선택)

  • Jang, Jae-Ho;Shin, Dong-Hoon;Lee, Kyong-Ho
    • Journal of KIISE:Software and Applications
    • /
    • v.34 no.5
    • /
    • pp.431-446
    • /
    • 2007
  • Executable composite Web services are selected by binding a given abstract workflow to specific Web services that satisfy given QoS requirements. Considering the rapidly increasing number of Web services and their highly dynamic QoS environment, fast selection of composite services is important. This paper presents a method for quality-driven composite Web service selection based on a workflow partition strategy. The proposed method partitions an abstract workflow into two sub-workflows to decrease the number of candidate services that must be considered, and the QoS requirement is also decomposed for each partitioned workflow. Since the decomposition of a QoS requirement is based on heuristics, the selection might fail to find composite Web services. To avoid such a failure, the tightness of a QoS requirement is defined, and whether to partition a workflow is decided according to this tightness. Mixed integer linear programming is utilized for efficient service selection. Experimental results show that the success rate of partitioning is above 99%. In particular, the proposed method runs faster and selects composite services whose qualities differ by less than 5% from the optimal one.
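
The selection step described above, choosing one concrete service per abstract task subject to a QoS bound, can be written as a small 0-1 integer program. The sketch below uses the PuLP library with invented candidate services; the quality-maximizing objective, the candidate data, and the single end-to-end response-time constraint are illustrative assumptions, not the paper's full model or its partitioning heuristic.

```python
# Sketch of QoS-constrained composite service selection as a 0-1 integer program.
# Candidate services, their QoS values, and the latency bound are invented.
import pulp

# candidates[task] = list of (service name, response_time_ms, quality_score)
candidates = {
    "t1": [("s1a", 120, 0.90), ("s1b", 60, 0.70)],
    "t2": [("s2a", 200, 0.95), ("s2b", 90, 0.80)],
}
latency_bound = 250  # end-to-end QoS requirement for this (sequential) workflow

prob = pulp.LpProblem("composite_service_selection", pulp.LpMaximize)
x = {(t, s): pulp.LpVariable(f"x_{t}_{s}", cat="Binary")
     for t, svcs in candidates.items() for s, _, _ in svcs}

# Maximize the summed quality of the chosen services.
prob += pulp.lpSum(q * x[(t, s)] for t, svcs in candidates.items() for s, _, q in svcs)

# Exactly one concrete service per abstract task.
for t, svcs in candidates.items():
    prob += pulp.lpSum(x[(t, s)] for s, _, _ in svcs) == 1

# End-to-end response time of the sequential workflow must meet the bound.
prob += pulp.lpSum(rt * x[(t, s)]
                   for t, svcs in candidates.items() for s, rt, _ in svcs) <= latency_bound

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([key for key, var in x.items() if var.value() == 1])
```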

Web Learning Systems Development based on Product Line (프로덕트 라인 기반의 웹 학습 시스템 개발)

  • Kim Haeng-Hon;Kim Su-Youn
    • The KIPS Transactions:PartD
    • /
    • v.12D no.4 s.100
    • /
    • pp.589-600
    • /
    • 2005
  • Application developers need an effective, reusable methodology to meet rapidly changing and varied user requirements. Product line and CBD (Component-Based Development) approaches offer great benefits in quality and productivity for developing software built on reusable architectures and components in a specific domain and in rapidly changing environments. A product line focuses on the commonality and variability among products, using feature modeling to discover, analyze, and mediate interactions between products. Reusable architectures include many variability plans and mechanisms, and because such architectures are used across product versions over a long time, variability management is very important in the product line design phase. Application developers need to identify the proper locations for architectural change in order to express variability, yet until now specific variability management for designing product line architectures has been lacking. In this paper, we define variability types that identify the proper locations of architectural change and support the design of reusable architectures. We also propose architectural variability on the feature model and describe the expression of variability in component relations. We implemented a web learning system based on this methodology and describe how it may increase efficiency, reusability, productivity, and quality in application development. In the future, we plan to apply the methodology to various domains and to propose international and domestic standardization.
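
One lightweight way to picture the commonality/variability distinction the abstract refers to is a feature model in which each concrete product selects among mandatory, optional, and alternative features. The toy web-learning feature names and the validation rules below are assumptions for illustration, not the paper's model.

```python
# Toy feature model for a web learning product line (feature names are illustrative).
# Mandatory features are common to every product; optional and alternative
# features are the variation points a concrete product must resolve.
MANDATORY = {"course_catalog", "user_login"}
OPTIONAL = {"discussion_board", "progress_report"}
ALTERNATIVE_GROUPS = {"delivery": {"self_paced", "instructor_led"}}

def is_valid_product(features: set) -> bool:
    """Check that a configuration resolves every variation point correctly."""
    if not MANDATORY <= features:
        return False                      # all common features must be present
    for group in ALTERNATIVE_GROUPS.values():
        if len(features & group) != 1:
            return False                  # exactly one alternative per group
    known = MANDATORY | OPTIONAL | set().union(*ALTERNATIVE_GROUPS.values())
    return features <= known              # no unknown features

print(is_valid_product({"course_catalog", "user_login", "self_paced", "progress_report"}))  # True
print(is_valid_product({"course_catalog", "self_paced"}))                                   # False
```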

Component-Based Systematic Reengineering Process (컴포넌트 기반의 체계적인 재공학 프로세스)

  • Cha Jung-Jun;Kim Chul Hong;Yang Young-Jong
    • The KIPS Transactions:PartD
    • /
    • v.12D no.7 s.103
    • /
    • pp.947-956
    • /
    • 2005
  • Software(S/W) reengineering is one of the effective technologies to produce a business worth and en and the S/W ROI continuously. In spite of, S/W reengineering has been recognized a cost-consumptive works with inefficient productivity. In fact we have used to transform to confusion system with destructive system architecture by extending and updating legacy system in a temporary expedients. Moreover it is impossible to provide the time-market products for coping with rapid changeable system environment and meeting to complicated customer's requirements. Therefore, we need a systematic reengineering methodology to fulfill the changeable environment, as appearance of new IT techniques, various alteration of business information model, and increment of business logic. Legacy systems can be utilized as the core property in business organization through reengineering methodology. In this paper, we target to establish the reengineering process, proposed MaRMI-RE consisting of initial Planning phase, reverse engineering and component transformation phase. To describe the MaRMI-RE, we presented the concrete tasks and techniques and artifacts per individual phase in process, and the case study is showed briefly.

Application of Gamma Ray Densitometry in Powder Metallurgy

  • Schileper, Georg
    • Proceedings of the Korean Powder Metallurgy Institute Conference
    • /
    • 2002.07a
    • /
    • pp.25-37
    • /
    • 2002
  • The most important industrial application of gamma radiation in characterizing green compacts is the determination of density. Examples are given where this method is applied in manufacturing technical components in powder metallurgy. The requirements imposed by modern quality management systems and by operation by the workforce in industrial production are described, the accuracy of measurement achieved with this method is demonstrated, and a comparison is given with other test methods for measuring density. The advantages and limitations of gamma ray densitometry are outlined. The gamma ray densitometer measures the attenuation of gamma radiation penetrating the test parts (Fig. 1). As the capability of compacts to absorb this type of radiation depends on their density, the attenuation of gamma radiation can serve as a measure of the density. The volume of the part being tested is defined by the size of the aperture screening out the radiation: it is a channel with the cross section of the aperture whose length is the height of the test part. The intensity of the radiation registered by the detector is the quantity used to determine the material density. Gamma ray densitometry can be performed on green compacts as well as on sintered components. Neither special preparation of test parts nor skilled personnel is required to perform the measurement, and neither liquids nor other harmful substances are involved. When parts exhibit local density variations, which is normally the case in powder compaction, sectional densities can be determined in different regions of the sample without cutting it into pieces. The test is non-destructive, i.e. the parts can still be used after the measurement and do not have to be scrapped. The measurement is controlled by special PC-based software, and all results are available for further processing by in-house quality documentation and supervision of measurements. Tool setting for multi-level components can be much improved by using this test method. When a densitometer is installed on the press shop floor, it can be operated by the tool setter himself, who can then return to the press and immediately implement the corrections. Transfer of sample parts to the lab for density testing can be eliminated, and results for the correction of tool settings are more readily available. This helps to reduce the time required for tool setting and clearly improves the productivity of powder presses. The range of materials where this method can be successfully applied covers almost the entire periodic system of the elements: it reaches from light elements such as graphite via light metals (Al, Mg, Li, Ti) and their alloys, ceramics (Al₂O₃, SiC, Si₃N₄, ZrO₂, ...), magnetic materials (hard and soft ferrites, AlNiCo, Nd-Fe-B, ...), and metals including iron and alloy steels, Cu, Ni, and Co based alloys, to refractory and heavy metals (W, Mo, ...) as well as hardmetals. The gamma radiation required for the measurement is generated by radioactive sources produced by nuclear technology. These nuclear materials are safely encapsulated in stainless steel capsules so that no radioactive material can escape from the protective shielding container. The gamma ray densitometer is subject to the strict regulations for the use of radioactive materials. The radiation shield is so effective that there is no elevation of the natural radiation level outside the instrument, and personal dosimetry for the operating personnel is not required. Even in case of malfunction, loss of power, or incorrect operation, the escape of gamma radiation from the instrument is positively prevented. A minimal numerical sketch of the underlying attenuation relationship appears after this entry.

  • PDF
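
The physical relationship underlying the measurement is the exponential attenuation of gamma radiation, so a density can be recovered from the measured intensity ratio once the mass attenuation coefficient and the radiation path length through the part are known. The sketch below shows that inversion; the numeric values of the attenuation coefficient, part height, and intensities are invented for illustration and are not taken from the paper.

```python
# Density from gamma-ray attenuation: I = I0 * exp(-mu_m * rho * t), solved for rho.
# mu_m, the part height, and the intensities below are illustrative values only.
import math

def density_from_attenuation(i0: float, i: float, mu_m: float, thickness_cm: float) -> float:
    """Return density in g/cm^3 from incident/transmitted intensities.

    i0, i         -- incident and transmitted count rates (same units)
    mu_m          -- mass attenuation coefficient in cm^2/g at the source energy
    thickness_cm  -- radiation path length through the part (its height) in cm
    """
    return math.log(i0 / i) / (mu_m * thickness_cm)

# Example: a hypothetical green compact, 1.2 cm high, mu_m = 0.075 cm^2/g.
print(round(density_from_attenuation(i0=10000.0, i=5400.0, mu_m=0.075, thickness_cm=1.2), 2))
```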