• Title/Summary/Keyword: Practical Use


A Study on Middle School Students' Satisfaction and Need for Clothing section of Home Economics in the Textbook (의생활 영역에 대한 중학생의 수업만족도 및 필요도에 관한 연구)

  • Kang Mi-Hyang;Oh Kyung-Wha
    • Journal of Korean Home Economics Education Association / v.18 no.2 s.40 / pp.63-77 / 2006
  • The purpose of this study is to provide basic data for improving the contents of the clothing curriculum in the 7th Technology-Home Economics curriculum of middle school. The students' level of satisfaction with each detailed domain, the perceived necessity and practical use of the contents, and learners' patterns of use of the activity tasks suggested in textbooks were evaluated. A total of 169 ninth-grade boys and 336 girls in the national capital region participated in the survey. According to the results, first, the dress domain received the highest relative importance (28.56%) while the clothing materials domain received the lowest (8.07%) among the detailed domains. Second, satisfaction with each detailed domain fell below the average; in general, girls' satisfaction with the teaching was higher than boys'. Third, the clothing materials domain showed the lowest perceived necessity of textbook contents, while the other domains scored above average; boys rated the necessity of textbook contents higher than girls did. In addition, neither boys nor girls were interested in the relevance of the textbook contents, and in particular they could not perform or understand the experiments and practice exercises in the clothing section. Finally, the degree of utilization of the activity tasks in textbooks was very low; among the various activity tasks, discovery and exploration learning were used more than cooperative learning.


Research about CAVE Practical Use Way Through Culture Content's Restoration Process that Utilize CAVE (가상현실시스템(CAVE)을 활용한 문화 Content의 복원 과정을 통한 CAVE활용 방안에 대한 연구)

  • Kim, Tae-Yul;Ryu, Seuc-Ho;Hur, Yung-Ju
    • Journal of Korea Game Society / v.4 no.3 / pp.11-20 / 2004
  • The virtual reality we saw in the movies of the '80s and '90s is drawing near, based on the rapid progress of science and computer technology. The development and advancement of various virtual reality systems (such as VRML, HMD, FishTank, Wall type, CAVE type, and so on) make possible an embodiment of virtual reality that gives a stronger sense of the real. Virtual reality is so immersive that people feel as if they are inside the environment, and it enables them to manipulate environments that are hard to experience at first hand in reality. Virtual reality can be applied to fields such as education, high-level programming, remote control, surface exploration of remote satellites, analysis of exploration data, and scientific visualization. Concrete examples include tank and aeroplane operation training, furniture layout design, surgical practice, and games. In these virtual reality systems, the actual motion of the human participant and the virtual workspace are connected to hardware that stimulates the five senses appropriately to lend a sense of immersion. There is still a long way to go, however, before further study and effort make it possible to feel in virtual reality exactly what a human being feels in reality. In this thesis, the basic definition, general concepts, and kinds of virtual reality are discussed. In particular, the highly immersive CAVE type of virtual reality is analyzed and defined, and a method of VR programming and modeling in a virtual reality system is suggested through the design process of the restoration of Kyongbok Palace (as a content of an original cultural form), produced by KISTI (Korea Institute of Science and Technology Information) in 2003.
Through these processes, the utilization of the immersive virtual reality system is discussed, along with how to take advantage of this CAVE-type virtual reality system at present. In closing, the problems exposed in the process of restoring the cultural property are described, and a plan for utilizing the virtual reality system is suggested.


Development of Program for Renal Function Study with Quantification Analysis of Nuclear Medicine Image (핵의학 영상의 정량적 분석을 통한 신장기능 평가 프로그램 개발)

  • Song, Ju-Young;Lee, Hyoung-Koo;Suh, Tae-Suk;Choe, Bo-Young;Shinn, Kyung-Sub;Chung, Yong-An;Kim, Sung-Hoon;Chung, Soo-Kyo
    • The Korean Journal of Nuclear Medicine / v.35 no.2 / pp.89-99 / 2001
  • Purpose: In this study, we developed a new software tool for the analysis of renal scintigraphy that can be modified easily by users who need to study new clinical applications, and the appropriateness of the results from our program was evaluated. Materials and Methods: The analysis tool was programmed in IDL 5.2 and designed for use on a personal computer running Windows. For testing the developed tool and studying the appropriateness of the calculated glomerular filtration rate (GFR), $^{99m}$Tc-DTPA was administered to 10 normal adults. In order to study the appropriateness of the calculated mean transit time (MTT), $^{99m}$Tc-DTPA and $^{99m}$Tc-MAG3 were administered to 11 normal adults and 22 kidneys were analyzed. All images were acquired with the ORBITOR, the Siemens gamma camera. Results: With the developed tool, we could display dynamic renal images and the time-activity curve (TAC) for each ROI, and calculate clinical parameters of renal function. The results calculated by the developed tool did not differ statistically from those obtained with the Siemens application program (Tmax: p=0.68, relative renal function: p=1.0, GFR: p=0.25), and the developed program proved reasonable. The MTT calculation tool also proved reasonable on evaluation of the influence of hydration status on MTT. Conclusion: We obtained reasonable clinical parameters for the evaluation of renal function with the software tool developed in this study. The developed tool could prove more practical than conventional, commercial programs.
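As an illustration of the ROI-based analysis the abstract describes, here is a minimal sketch of computing a time-activity curve and Tmax from a dynamic image series. The array shapes, frame timing, and ROI below are invented for the example, not taken from the paper.

```python
import numpy as np

def time_activity_curve(frames, roi_mask):
    """Total counts inside the ROI for each time frame."""
    return np.array([frame[roi_mask].sum() for frame in frames])

def t_max(tac, frame_seconds):
    """Time (in seconds) at which ROI activity peaks."""
    return int(np.argmax(tac)) * frame_seconds

# Synthetic 3-frame, 4x4 dynamic series with a 2x2 "kidney" ROI.
frames = np.zeros((3, 4, 4))
roi = np.zeros((4, 4), dtype=bool)
roi[1:3, 1:3] = True
frames[0][roi] = 1.0   # uptake rising...
frames[1][roi] = 5.0   # ...peaks in frame 1...
frames[2][roi] = 2.0   # ...then washes out
tac = time_activity_curve(frames, roi)   # counts per frame: [4, 20, 8]
peak = t_max(tac, frame_seconds=10)      # peak at 10 s
```

A real tool would additionally correct for background and decay before deriving parameters such as GFR from the TAC.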


Game Theoretic Optimization of Investment Portfolio Considering the Performance of Information Security Countermeasure (정보보호 대책의 성능을 고려한 투자 포트폴리오의 게임 이론적 최적화)

  • Lee, Sang-Hoon;Kim, Tae-Sung
    • Journal of Intelligence and Information Systems / v.26 no.3 / pp.37-50 / 2020
  • Information security has become an important issue worldwide. Various information and communication technologies, such as the Internet of Things, big data, cloud, and artificial intelligence, are developing, and the need for information security is increasing. Although the necessity of information security is expanding with the development of information and communication technology, interest in information security investment remains insufficient. In general, measuring the effect of information security investment is difficult, so appropriate investment is not being practiced, and organizations are decreasing their information security investment. In addition, since the types and specifications of information security measures are diverse, it is difficult to compare and evaluate information security countermeasures objectively, and there is a lack of decision-making methods for information security investment. To develop an organization, policies and decisions related to information security are essential, and measuring the effect of information security investment is necessary. Therefore, this study proposes a method of constructing an investment portfolio for information security measures using game theory and derives an optimal defence probability. Using a two-person game model, the information security manager and the attacker are assumed to be the players, and the information security countermeasures and information security threats are assumed to be their respective strategies. A zero-sum game, in which the sum of the players' payoffs is zero, is assumed, and we derive the solution of a mixed-strategy game in which a strategy is selected according to a probability distribution over strategies. In the real world, various types of information security threats exist, so multiple information security measures should be considered to maintain an appropriate information security level.
We assume that the defence ratio of each information security countermeasure is known, and we derive the optimal solution of the mixed-strategy game using linear programming. The contributions of this study are as follows. First, we conduct the analysis using real performance data of information security measures; information security managers can use the suggested methodology to make practical decisions when establishing an investment portfolio of countermeasures. Second, the investment weight of each information security countermeasure is derived. Since we derive the weight of each measure, not just whether it has been invested in, it is easy to construct an investment portfolio when decisions must consider a number of countermeasures. Finally, it is possible to find the optimal defence probability after constructing the portfolio, and managers can measure the specific investment effect by selecting the countermeasures that fit the organization's information security budget. Numerical examples are also presented and the computational results analyzed. Based on the performance of three information security countermeasures (Firewall, IPS, and Antivirus), data are collected to construct a portfolio. The defence ratio of each countermeasure is generated using a uniform distribution, and a coverage of performance is derived based on the report of each countermeasure.
According to the numerical examples considering Firewall, IPS, and Antivirus as countermeasures, the investment weights of Firewall, IPS, and Antivirus are optimized to 60.74%, 39.26%, and 0%, respectively, and the defence probability of the organization is maximized at 83.87%. When the methodology and examples of this study are used in practice, information security managers can consider various types of information security measures, and the appropriate investment level of each measure can be reflected in the organization's budget.
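The mixed-strategy solution via linear programming can be sketched as follows, using the standard LP formulation for the row player of a zero-sum game (maximize the game value v subject to every attacker column yielding at least v). The 2x2 defence-ratio matrix is hypothetical, not the Firewall/IPS/Antivirus data of the paper.

```python
import numpy as np
from scipy.optimize import linprog

def solve_zero_sum(payoff):
    """Optimal mixed strategy for the row player (the defender) of a
    zero-sum game. Variables are [p_1..p_m, v]; we minimize -v subject
    to v <= sum_i p_i * payoff[i, j] for every attacker column j."""
    m, n = payoff.shape
    c = np.zeros(m + 1)
    c[-1] = -1.0                                   # maximize v
    A_ub = np.hstack([-payoff.T, np.ones((n, 1))]) # v - p.A[:,j] <= 0
    b_ub = np.zeros(n)
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])  # probs sum to 1
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:m], res.x[-1]

# Hypothetical defence ratios: rows = countermeasures, cols = threat types.
A = np.array([[0.9, 0.2],
              [0.3, 0.8]])
weights, value = solve_zero_sum(A)
```

For this matrix the optimal investment weights are 5/12 and 7/12, giving a guaranteed defence probability of 0.55 whatever the attacker does.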

Valuation of Mining Investment Projects by the Real Option Approach - A Case Study of Uzbekistan's Copper Mining Industry - (실물옵션평가방법에 의한 광산투자의 가치평가 -우즈베키스탄 구리광산업의 사례연구를 중심으로-)

  • Makhkamov, Mumm Sh.;Kim, Dong-Hwan
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.8 no.6
    • /
    • pp.1634-1647
    • /
    • 2007
  • "To invest or not to invest?" Most business leaders are frequently faced with this question on new and ongoing projects. The challenge lies in deciding which projects to choose, expand, contract, defer, or abandon, and the project valuation tools used in this process are vital to making the right decisions. Traditional tools such as discounted cash flow (DCF)/net present value (NPV) assume a "fixed" path ahead, but real-world projects face uncertainties that force frequent changes of path. Compared to traditional valuation methods, the real options approach captures the flexibility inherent in investment decisions. The use of real options has gained wide acceptance among practitioners in a number of industries during the last few decades. Yet even though options are present in all types of business decisions, real options analysis is still not considered a standard valuation method in some industries. Mining has been comparatively slow to adopt new valuation techniques over the years. The reason for this is not entirely clear; one possible reason is the level and types of risk in mining: not only are these risks high, they are also more numerous and involve natural risks more than in other industries. The purpose of this study is therefore to apply a more practical approach to project valuation, known as real options analysis, in the mining industry. This paper provides a case study of the copper mining industry using real options analysis, and shows how companies can minimize investment risks, exercise flexibility in decision making, and maximize returns.
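As a sketch of the kind of flexibility the real options approach values, the following prices an option to defer investment as an American call on project value, using a Cox-Ross-Rubinstein binomial lattice. All the inputs (project value, cost, volatility, rate) are hypothetical, not figures from the Uzbek case study.

```python
import math

def defer_option_value(V0, I, T, sigma, r, steps):
    """Value of the option to defer an irreversible investment, priced
    as an American call on project value V0 with strike I, using a
    Cox-Ross-Rubinstein binomial lattice."""
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)          # risk-neutral probability
    disc = math.exp(-r * dt)
    # Payoffs at the final decision date.
    values = [max(V0 * u**j * d**(steps - j) - I, 0.0) for j in range(steps + 1)]
    # Backward induction, allowing investment (exercise) at every node.
    for i in range(steps - 1, -1, -1):
        values = [max(disc * (p * values[j + 1] + (1 - p) * values[j]),
                      V0 * u**j * d**(i - j) - I)
                  for j in range(i + 1)]
    return values[0]

# Hypothetical copper project: value 100, investment cost 90,
# 35% volatility, 5% risk-free rate, a 2-year window to defer.
opt = defer_option_value(V0=100.0, I=90.0, T=2.0, sigma=0.35, r=0.05, steps=200)
static_npv = 100.0 - 90.0
```

For these inputs the deferral option is worth well above the static NPV of 10; the difference is the value of flexibility that DCF/NPV ignores.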


A Study for the Methodology of Analyzing the Operation Behavior of Thermal Energy Grids with Connecting Operation (열 에너지 그리드 연계운전의 운전 거동 특성 분석을 위한 방법론에 관한 연구)

  • Im, Yong Hoon;Lee, Jae Yong;Chung, Mo
    • KIPS Transactions on Computer and Communication Systems / v.1 no.3 / pp.143-150 / 2012
  • A simulation methodology, and a corresponding program based on it, is discussed for analyzing the effects of networked operation of an existing district heating and cooling (DHC) system in connection with an on-site CHP system. A practical simulation of arbitrary areas with various building compositions is carried out to analyze the operational features of both systems, and various aspects of thermal energy grids with connecting operation are highlighted through detailed assessment of the predicted results. The intrinsic operational features of CHP prime movers (gas engines, gas turbines, etc.) are effectively implemented by using performance data, i.e. actual operating efficiency over the full- and part-load range. For the sake of simplicity, a simple mathematical correlation model is proposed to simulate the various changes on the existing DHC side due to the connecting operation, instead of performing separate cycle simulations. The empirical correlations are developed using hourly annual operation data for a branch of the Korea District Heating Corporation (KDHC) and relate the main operation parameters, such as fuel consumption by use, heat production, and power production. In the simulation, a variety of system configurations can be considered, combining the probable CHP prime movers with absorption- or turbo-type cooling chillers of any kind and capacity. From the analysis of the thermal network operation simulations, it is found that the newly proposed correlation-based methodology for modelling the existing DHC system effectively reflects the operational variations due to connecting operation of thermal energy grids. The effects of the intrinsic features of the CHP prime movers (e.g. different heat-to-power production ratios) and of various combinations of chiller types (absorption and turbo) on overall system operation are discussed in detail, with consideration of operation schemes and the corresponding simulation algorithms.
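The empirical-correlation idea (fuel consumption expressed as a function of heat and power production, fitted from hourly operation data) can be sketched with an ordinary least-squares fit. The "true" coefficients and the synthetic operation records below are assumptions for illustration, not KDHC data.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic hourly operation records for one year (8760 h); the "true"
# coefficients 1.2, 2.5 and 30 are invented for the illustration.
heat = rng.uniform(50.0, 150.0, 8760)    # heat production, MWh_th
power = rng.uniform(10.0, 60.0, 8760)    # power production, MWh_e
fuel = 1.2 * heat + 2.5 * power + 30.0 + rng.normal(0.0, 2.0, 8760)

# Least-squares fit of the correlation: fuel ~ a*heat + b*power + c.
X = np.column_stack([heat, power, np.ones_like(heat)])
coef, *_ = np.linalg.lstsq(X, fuel, rcond=None)
a, b, c = coef
```

With a full year of hourly data, the fitted coefficients recover the underlying relation closely, which is what makes such correlations a workable stand-in for a separate cycle simulation.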

Detection of Phantom Transaction using Data Mining: The Case of Agricultural Product Wholesale Market (데이터마이닝을 이용한 허위거래 예측 모형: 농산물 도매시장 사례)

  • Lee, Seon Ah;Chang, Namsik
    • Journal of Intelligence and Information Systems / v.21 no.1 / pp.161-177 / 2015
  • With the rapid evolution of technology, the size, number, and types of databases have increased concomitantly, so data mining approaches face many challenging applications. One such application is the discovery of fraud patterns in agricultural product wholesale transaction records. The agricultural product wholesale market in Korea is huge, and vast numbers of transactions are made every day. The demand for agricultural products continues to grow, and the use of electronic auction systems raises the operational efficiency of the wholesale market. The number of unusual transactions can be assumed to increase in proportion to the trading volume, and an unusual transaction is often the first sign of fraud. However, it is very difficult to identify and detect these transactions, and the corresponding fraud, in the agricultural product wholesale market, because the types of fraud are more sophisticated than ever before. Fraud can be detected by verifying the overall transaction records manually, but this requires a significant amount of human resources and ultimately is not a practical approach. Fraud can also be revealed by a victim's report or complaint, but there are usually no victims in agricultural product wholesale fraud, because it is committed in collusion between an auction company and an intermediary wholesaler. Nevertheless, it is necessary to monitor transaction records continuously and make an effort to prevent fraud, because fraud not only disturbs the fair trade order of the market but also rapidly reduces the market's credibility. Applying data mining to such an environment is very useful, since it can properly discover unknown fraud patterns or features from a large volume of transaction data.
The objective of this research is to empirically investigate the factors necessary to detect fraudulent transactions in an agricultural product wholesale market by developing a data-mining-based fraud detection model. One major fraud is the phantom transaction, a colluding transaction by the seller (auction company or forwarder) and buyer (intermediary wholesaler). They pretend to fulfill the transaction by recording false data in the online transaction processing system without actually selling products, and the seller receives money from the buyer. This leads to overstatement of sales performance and illegal money transfers, which reduces the credibility of the market. This paper reviews the environment of the wholesale market, including the types of transactions, the roles of market participants, and the various types and characteristics of fraud, and introduces the whole process of developing the phantom transaction detection model. The process consists of four modules: (1) data cleaning and standardization; (2) statistical data analysis, such as distribution and correlation analysis; (3) construction of a classification model using decision-tree induction; and (4) verification of the model in terms of hit ratio. We collected real data from six associations of agricultural producers in metropolitan markets. The final decision-tree model revealed that the monthly average trading price of an item offered by forwarders is a key variable in detecting phantom transactions. The verification procedure also confirmed the suitability of the results. However, even though the performance of the results is satisfactory, sensitive issues remain in improving the classification accuracy and the conciseness of the rules. One such issue is the robustness of the data mining model: data mining is very data-oriented, so its models tend to be very sensitive to changes in data or circumstances.
This non-robustness evidently requires continuous remodeling as data or situations change. We hope that this paper suggests valuable guidelines to organizations and companies considering introducing or constructing a fraud detection model in the future.
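A minimal sketch of module (3), a decision-tree classifier flagging phantom transactions, assuming (as the abstract's final model suggests) that average trading price separates the classes. The synthetic data, feature set, and class means below are illustrative only, not the associations' real records.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Hypothetical features per transaction: [monthly avg price, quantity].
# Phantom transactions are assumed to be recorded at unusually low prices.
normal = np.column_stack([rng.normal(1000.0, 100.0, 200),
                          rng.normal(50.0, 10.0, 200)])
phantom = np.column_stack([rng.normal(400.0, 80.0, 40),
                           rng.normal(50.0, 10.0, 40)])
X = np.vstack([normal, phantom])
y = np.array([0] * 200 + [1] * 40)   # 1 = phantom transaction

# Shallow tree, so the induced rules stay concise and inspectable.
clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
hit_ratio = clf.score(X, y)          # module (4): verification by hit ratio
flag = clf.predict([[420.0, 45.0]])  # a suspiciously low-priced transaction
```

Because the classes are well separated on price, even a depth-2 tree yields a near-perfect hit ratio here; real transaction data would of course be far noisier.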

An Integrated Model based on Genetic Algorithms for Implementing Cost-Effective Intelligent Intrusion Detection Systems (비용효율적 지능형 침입탐지시스템 구현을 위한 유전자 알고리즘 기반 통합 모형)

  • Lee, Hyeon-Uk;Kim, Ji-Hun;Ahn, Hyun-Chul
    • Journal of Intelligence and Information Systems / v.18 no.1 / pp.125-141 / 2012
  • These days, malicious attacks and hacks on networked systems are increasing dramatically, and their patterns are changing rapidly. Consequently, it becomes more important to handle these malicious attacks and hacks appropriately, and there is substantial interest in and demand for effective network security systems such as intrusion detection systems. Intrusion detection systems are network security systems for detecting, identifying, and responding appropriately to unauthorized or abnormal activities. Conventional intrusion detection systems have generally been designed using experts' implicit knowledge of network intrusions or of hackers' abnormal behaviors. However, although they perform very well under normal conditions, they cannot handle new or unknown patterns of network attack. As a result, recent studies on intrusion detection systems use artificial intelligence techniques, which can respond proactively to unknown threats. Researchers have long adopted and tested various kinds of artificial intelligence techniques, such as artificial neural networks, decision trees, and support vector machines, to detect intrusions on the network. However, most have applied these techniques singly, even though combining them may lead to better detection. For this reason, we propose a new integrated model for intrusion detection. Our model is designed to combine the prediction results of four different binary classification models, which may be complementary to each other: logistic regression (LOGIT), decision trees (DT), artificial neural networks (ANN), and support vector machines (SVM). As a tool for finding the optimal combining weights, genetic algorithms (GA) are used. The proposed model is built in two steps. In the first step, the optimal integration model, whose prediction error (i.e. erroneous classification rate) is least, is generated.
After that, in the second step, the model explores the optimal classification threshold for determining intrusions, which minimizes the total misclassification cost. To calculate the total misclassification cost of an intrusion detection system, we need to understand its asymmetric error cost scheme. Generally, there are two common forms of error in intrusion detection. The first is the False Positive Error (FPE), in which normal activity is misjudged as an intrusion, possibly resulting in unnecessary remediation. The second is the False Negative Error (FNE), which misjudges malicious activity as normal. Compared to FPE, FNE is more fatal; thus, the total misclassification cost is affected more by FNE than by FPE. To validate the practical applicability of our model, we applied it to a real-world dataset for network intrusion detection. The experimental dataset was collected from the IDS sensor of an official institution in Korea from January to June 2010. We collected 15,000 log records in total and selected 10,000 samples from them by random sampling. We also compared the results from our model with the results from single techniques to confirm the superiority of the proposed model. LOGIT and DT were run using PASW Statistics v18.0, and ANN using Neuroshell R4.0. For SVM, LIBSVM v2.90, a freeware tool for training SVM classifiers, was used. Empirical results showed that our proposed GA-based model outperformed all the comparative models in detecting network intrusions, from both the accuracy perspective and the total misclassification cost perspective. Consequently, it is expected that our study may contribute to building cost-effective intelligent intrusion detection systems.
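The first step, searching for combining weights that minimize classification error, can be sketched with a toy genetic algorithm over four base models' scores. The validation scores are synthetic and the GA operators (averaging crossover, Gaussian mutation, elitist selection) are deliberately simple, so this illustrates the approach rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical intrusion scores from four base models (LOGIT, DT,
# ANN, SVM) on 200 validation samples, plus the true labels.
y = rng.integers(0, 2, 200)
preds = np.array([np.clip(y + rng.normal(0.0, s, 200), 0.0, 1.0)
                  for s in (0.45, 0.40, 0.35, 0.50)])

def error(w):
    """Misclassification rate of the weighted ensemble at threshold 0.5."""
    return np.mean(((w @ preds) >= 0.5).astype(int) != y)

def ga(pop_size=30, gens=40):
    """Toy GA: weights kept non-negative and summing to 1."""
    pop = rng.random((pop_size, 4))
    pop /= pop.sum(axis=1, keepdims=True)
    for _ in range(gens):
        pop = pop[np.argsort([error(w) for w in pop])]    # select fittest
        for k in range(pop_size // 2, pop_size):          # refill bottom half
            a, b = pop[rng.integers(0, pop_size // 2, 2)] # parents from top half
            child = np.clip((a + b) / 2 + rng.normal(0, 0.05, 4), 0, None)
            pop[k] = child / child.sum()                  # renormalize weights
    return pop[0], error(pop[0])

best_w, best_err = ga()
```

The paper's second step would then sweep the decision threshold of the combined score to minimize the asymmetric FPE/FNE cost rather than the plain error rate.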

Design and Implementation of Web Based Instruction Based on Constructivism for Self-Directed Learning Ability (구성주의 이론에 기반한 자기주도적 웹 기반 교육의 설계와 구현)

  • Kim Gi-Nam;Kim Eui-Jeong;Kim Chang-Suk
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2006.05a / pp.855-858 / 2006
  • Developing information technology makes it possible to change the paradigm of all kinds of areas, including education. Students can choose learning goals and objects themselves and acquire not an accumulation of knowledge but a method of learning. Moreover, teachers become advisers, and students play the key role in learning; that is, the subject of learning is the student. Constructivism emphasizes a student-oriented educational environment, which corresponds to the characteristics of hypermedia, and the Internet allows us to make a practical plan for constructivism. The Web provides a proper environment for putting constructivism into practice and drives the education system to change. Web-based instruction motivates students to learn more, and they can gain plenty of information regardless of place or time. Besides, they are able to consult more up-to-date information regarding their learning, use hypermedia such as images, audio, video, and text, and communicate effectively with their instructor through a bulletin board, e-mail, chatting, etc. Schools and instructors have been making efforts to develop new teaching models to cope with this changing environment. In this thesis, 'Design and Implementation of Web Based Instruction Based on Constructivism', learners are given the chance of self-directed learning through online, learner-oriented, indexed video lessons. In addition, learners do not have to cover all the contents of a lesson: they can choose the contents they want from an indexed list of the lesson, and they can search for the contents they want with a 'Keyword Search' on the main page, which can improve learners' achievement.


Modern Paper Quality Control

  • Olavi Komppa
    • Proceedings of the Korea Technical Association of the Pulp and Paper Industry Conference / 2000.06a / pp.16-23 / 2000
  • The increasing functional needs of top-quality printing papers and packaging paperboards, and especially the rapid development of electronic printing processes and various computer printers during the past few years, set new targets and requirements for modern paper quality. Most of these paper grades today have a relatively high filler content, are moderately or heavily calendered, and carry several coating layers for the best appearance and performance. In practice, this means that many of the traditional quality assurance methods, mostly designed to measure papers made of pure, native pulp only, cannot reliably (or at all) be used to analyze or rank the quality of modern papers. Hence, the introduction of new measurement techniques is necessary to assure and further develop paper quality today and in the future. Paper formation, i.e. small-scale (millimetre-scale) variation of basis weight, is the most important quality parameter in papermaking, due to its influence on practically all other quality properties of paper. The ideal paper would be completely uniform, so that the basis weight of each small area measured would be the same. In practice, of course, this is not possible, because relatively large local variations always exist in paper. However, these small-scale basis weight variations are the major reason for many other quality problems, including calender blackening, uneven coating results, uneven printing results, etc. The traditionally used visual inspection or optical measurement of paper does not give a reliable picture of the material variations in the paper, because in the modern papermaking process the optical behaviour of paper is strongly affected by fillers, dyes, and coating colours. Furthermore, the opacity (optical density) of the paper changes at different process stages, such as wet pressing and calendering.
The greatest advantage of using the beta transmission method to measure paper formation is that it can be very reliably calibrated to measure the true basis weight variation of all kinds of paper and board, independently of sample basis weight or paper grade. This makes it possible to measure, compare, and judge papers made of different raw materials or colours, and even to measure heavily calendered, coated, or printed papers. Scientific research in paper physics has shown that the orientation of the fibres in the top layer (paper surface) of the sheet plays the key role in paper curling and cockling, causing the typical practical problems (paper jams) with modern fax and copy machines, electronic printing, etc. On the other hand, the fibre orientation in the surface and middle layers of the sheet controls the bending stiffness of paperboard. Therefore, a reliable measurement of surface fibre orientation gives a magnificent tool to investigate and predict curling and cockling tendency, and provides the information necessary to fine-tune the manufacturing process for optimum quality. Many papers, especially heavily calendered and coated grades, strongly resist liquid and gas penetration, being beyond the measurement range of traditional instruments or requiring inconveniently long measuring times per sample. The increased surface hardness and the use of filler minerals and mechanical pulp make a reliable, non-leaking sample contact with the measurement head a challenge of its own. Paper surface coating, as expected, produces a layer whose permeability characteristics differ completely from those of the other layers of the sheet.
The latest developments in sensor technology have made it possible to reliably measure gas flow under well-controlled conditions, allowing us to investigate the gas penetration of open structures, such as cigarette paper, tissue, or sack paper, and, in the low-permeability range, to analyze even fully greaseproof papers, silicone papers, heavily coated papers and boards, or even to detect defects in barrier coatings. Even nitrogen or helium may be used as the gas, giving completely new possibilities to rank products or to find correlations with critical process or converting parameters. All modern paper machines include many on-line measuring instruments that provide the necessary information for automatic process control systems. Hence, the reliability of the information obtained from these sensors is vital for good optimization and process stability. If any of these on-line sensors does not operate exactly as planned (having even a small measurement error or malfunction), the process control will set the machine to operate away from the optimum, resulting in loss of profit or eventual problems in quality or runnability. To assure optimum operation of the paper machines, a novel quality assurance policy for the on-line measurements has been developed, including control procedures utilizing traceable, accredited standards for the best reliability and performance.
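Formation as small-scale basis weight variation is often summarized by a simple statistic such as the coefficient of variation over a fine measurement grid. The sketch below uses invented numbers (an 80 g/m² sheet with 3 g/m² local variation) purely to make the definition concrete; it is not a measurement from the paper.

```python
import numpy as np

# Synthetic basis weight map: an 80 g/m^2 sheet sampled on a 100x100
# millimetre-scale grid with 3 g/m^2 local variation (invented values).
rng = np.random.default_rng(7)
basis_weight = rng.normal(80.0, 3.0, size=(100, 100))

# Formation summarized as the coefficient of variation, in percent:
# lower CV means a more uniform sheet.
cv = basis_weight.std() / basis_weight.mean() * 100.0
```

A perfectly uniform ("ideal") sheet would have CV = 0; the beta transmission method described above is one way to obtain such a basis weight map reliably for coated or calendered grades.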