• Title/Summary/Keyword: Software Process Level and Process Performance

Counterfeit Money Detection Algorithm using Non-Local Mean Value and Support Vector Machine Classifier (비지역적 특징값과 서포트 벡터 머신 분류기를 이용한 위변조 지폐 판별 알고리즘)

  • Ji, Sang-Keun;Lee, Hae-Yeoun
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.2 no.1
    • /
    • pp.55-64
    • /
    • 2013
  • Due to the popularization of high-performance digital capture equipment and the emergence of powerful image-editing software, it is easy for anyone to produce high-quality counterfeit money. However, the probability that the general public detects a counterfeit banknote is extremely low. In this paper, we propose a counterfeit money detection algorithm using a general-purpose scanner. The algorithm identifies counterfeit money based on feature differences arising from the printing process. After non-local means filtering is used to extract the noise from each banknote, we compute statistical features from this noise by calculating a gray level co-occurrence matrix. These features are then used to train and test a support vector machine classifier that identifies banknotes as genuine or counterfeit. In the experiments, we used a total of 324 images of genuine and counterfeit money, and compared our noise features with those of previous studies based on the Wiener filter and the discrete wavelet transform. The accuracy of the algorithm in identifying counterfeit money was over 94%, and its accuracy in identifying the printing source was over 93%. The presented algorithm outperforms previous approaches.
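The core step of the pipeline described above (non-local means residual, then gray level co-occurrence statistics fed to an SVM) can be sketched for the co-occurrence part. The quantization to 8 gray levels and the contrast/homogeneity pair below are illustrative choices, not the paper's exact parameters:

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Normalized gray level co-occurrence matrix for one pixel offset.

    `img` is assumed to be already quantized to integers in [0, levels).
    """
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def texture_features(p):
    """Two common GLCM statistics used as classifier inputs."""
    i, j = np.indices(p.shape)
    contrast = float((p * (i - j) ** 2).sum())
    homogeneity = float((p / (1.0 + np.abs(i - j))).sum())
    return contrast, homogeneity

# A flat noise residual has no texture; an alternating one has high contrast.
flat_c, flat_h = texture_features(glcm(np.zeros((4, 4), dtype=int)))
alt = np.tile(np.array([[0, 7], [7, 0]]), (2, 2))
alt_c, _ = texture_features(glcm(alt))
```

Vectors of such statistics (typically over several offsets and angles) would then be passed to the SVM training stage.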

Design and Implementation of An I/O System for Irregular Application under Parallel System Environments (병렬 시스템 환경하에서 비정형 응용 프로그램을 위한 입출력 시스템의 설계 및 구현)

  • No, Jae-Chun;Park, Seong-Sun;Gwon, O-Yeong
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.26 no.11
    • /
    • pp.1318-1332
    • /
    • 1999
  • In this paper we present the design, implementation and evaluation of a runtime system based on collective I/O techniques for irregular applications. We present two designs, namely "Collective I/O" and "Pipelined Collective I/O". In the first scheme, all processors participate in the I/O simultaneously, making scheduling of I/O requests simpler but creating a possibility of contention at the I/O nodes. In the second approach, processors are grouped into several groups, so that only one group performs I/O at a time while the next group performs communication to rearrange data, and this entire process is pipelined to reduce I/O node contention dynamically. In other words, the design provides support for dynamic contention management. We then present a software caching method using collective I/O to reduce I/O cost by reusing data already present in the memory of other nodes. Finally, chunking and on-line compression mechanisms are included in both models. We demonstrate that we can obtain significantly higher I/O performance than has been possible so far. The performance results were obtained on an Intel Paragon and on the ASCI/Red teraflops machine. Application-level I/O bandwidth up to 55% of the peak is observed.
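The scheduling idea behind pipelined collective I/O can be illustrated with a toy scheduler. The grouping and the two phases (I/O versus data rearrangement) follow the description above; the round-robin group assignment is an illustrative simplification, not the paper's exact algorithm:

```python
def pipelined_collective_io(processors, n_groups):
    """Split processors into groups; at each pipeline step one group
    performs I/O while the next group rearranges (communicates) data,
    so the I/O nodes never see all processors at once."""
    groups = [processors[i::n_groups] for i in range(n_groups)]
    schedule = []
    for step, group in enumerate(groups):
        schedule.append({
            "step": step,
            "io": group,                              # group doing I/O now
            "comm": groups[(step + 1) % n_groups],    # group rearranging data
        })
    return schedule

# Eight processors in two groups: groups alternate between the phases.
sched = pipelined_collective_io(list(range(8)), 2)
```

At every step the I/O group and the communicating group are disjoint, which is what dynamically limits contention at the I/O nodes.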

The Empirical Study on the Effects of the Team Empowerment caused by the Team-Based Organizational Structure in KBS (팀제가 팀 임파워먼트에 미치는 영향에 관한 연구;KBS 팀제를 중심으로)

  • Ahn, Dong-Su;Kim, Hong
    • Korean Society of Business Venturing: Conference Proceedings
    • /
    • 2006.04a
    • /
    • pp.167-201
    • /
    • 2006
  • Korean corporations are transforming their vertical operational structures into team-based structures to compete in a rapidly changing environment and to improve performance. However, a high percentage of the respondents in KBS said that, despite the appearance of the present team structure, the organization still operates much like a vertically structured organization. This result can be attributed to the lack of study and implementation of empowerment, the key variable for the success of a team-based structure. This study aims to provide policy suggestions on how to implement the process of empowerment by investigating the conditions that hinder the process and the attitudes of KBS employees. For the cross-sectional study, this thesis examined domestic and international references, conducted a survey of KBS employees, and carried out personal interviews and direct observations. Approximately 1,200 copies of the questionnaire were distributed, and 474 were completed and returned. The analysis used SPSS 12.0 software to process the data collected from 460 respondents. For the longitudinal study, six categories common to this study and "The Report of the Findings of KBS Employees' View of the Team Structure" were selected. The comparative study analyzed the changes over a ten-month period. The survey findings showed a decrease of 24.2%p in the number of responses expressing negative views of the team structure and a decrease of 1.29%p in the number of positive responses. The findings indicated a positive transformation, illustrating employees' improved understanding and approval of the team structure. However, KBS must address the issue on an ongoing basis. It has been shown that employee empowerment increases the productivity of both the individual and the group. In order to boost the level of empowerment, management must first exercise new, innovative leadership and build trust between managers and employees.
Additional workload resulting from shirking in the workplace was prevalent throughout all divisions and ranks, according to the survey data. This outcome leads to the conclusion that the workload is not evenly distributed or shared. The data also showed that employees do not trust the assessment and rewards system. More attention and consideration must be paid to team size and job allocation in order to address this matter, and the present assessment and rewards system needs to be supplemented. The type of leadership varies depending on the characteristics of the organization's structure and employees' dispositions. KBS must develop and reform its own management and leadership style to suit the characteristics of individual teams. Finally, for a soft landing of the KBS team structure, in-house training and education are necessary.

Exploring Influence of Network Structure, Organizational Learning Culture, and Knowledge Management Participation on Individual Creativity and Performance: Comparison of SI Proposal Team and R&D Team (네트워크 구조와 조직학습문화, 지식경영참여가 개인창의성 및 성과에 미치는 영향에 관한 실증분석: SI제안팀과 R&D팀의 비교연구)

  • Lee, Kun-Chang;Seo, Young-Wook;Chae, Seong-Wook;Song, Seok-Woo
    • Asia pacific journal of information systems
    • /
    • v.20 no.4
    • /
    • pp.101-123
    • /
    • 2010
  • Recently, firms operate a number of teams to accomplish organizational performance. In particular, ad hoc teams such as proposal preparation teams are quite different from permanent teams such as R&D teams in how the team forms its network structure and deals with organizational learning culture and knowledge management participation. Moreover, depending on team characteristics, individual creativity will differ, which eventually leads to organizational performance. Previous studies in the field of creativity have paid little attention to this issue. The main objectives of this study are therefore as follows. First, the issue of how to improve individual creativity and organizational performance is analyzed empirically, separately for team characteristics such as ad hoc teams and permanent teams. The antecedents adopted for this research objective are cultural and knowledge factors, namely organizational learning culture and knowledge management participation. Second, network-structure measures such as degree centrality and structural holes are used to analyze their influence on individual creativity and organizational performance. SI (System Integration) companies face severely tough requirements from clients to submit very creative proposals. Also, R&D teams are widely accepted as relatively creative teams because their responsibilities focus on suggesting innovative techniques that keep their companies competitive in the market. SI proposal teams are usually ad hoc, while R&D teams are, on average, permanent. By taking advantage of these characteristics of the two kinds of teams, we examine the validity of the proposed research questions. To obtain the survey data, we accessed 7 SI teams (74 members) and 6 R&D teams (63 members), collecting 137 valid questionnaires. The PLS technique was applied to analyze the survey data. Results are as follows.
First, in the case of SI teams, organizational learning culture affects individual creativity significantly. Meanwhile, knowledge management participation has a significant influence on individual creativity for the permanent teams. Second, degree centrality influences individual creativity significantly in the case of SI teams. This is comparable with the fact that structural holes have a significant impact on individual creativity for the R&D teams. Practical implications can be summarized as follows. First, the network structure of an ad hoc team should be designed differently from that of a permanent team. An ad hoc team is supposed to show high creativity in a rather short period, implying that network density among team members should be improved, and that members with high degree centrality should be encouraged to show their individual creativity and take a leading role by getting heavily engaged in knowledge sharing and diffusion. In contrast, a permanent team should be designed to take advantage of structural holes instead of focusing on network density. Since structural holes can be utilized very effectively in the permanent team, the merits of strong arbitrators in the permanent team increase, which helps improve both network efficiency and effectiveness. In this way, individual creativity in the permanent team is likely to lead to organizational creativity in a seamless way. Second, ways of increasing individual creativity should be sought from the perspective of organizational culture and knowledge management. The organization is supposed to provide a cultural atmosphere in which innovative idea suggestions and active discussion among team members are encouraged. In this way, trust builds up among team members, facilitating the formation of an organizational learning culture. Third, in the ad hoc team, the organizational learning culture should be built in such a way that individual creativity can grow fast in a rather short period.
Since time is tight, a reasonable compensation policy, the leader's initiatives, and learning culture formation should be established in a short period so that mutual trust is built among members quickly, and the necessary knowledge and information can be learnt rapidly. Fourth, in the permanent team, it should be kept in mind that the degree of participation in knowledge management determines the level of individual creativity. Therefore, the team ought to facilitate knowledge circulation processes such as knowledge creation, storage, sharing, utilization, and learning among team members, which will lead to team performance. In this way, firms should manage knowledge networks in permanent teams and ad hoc teams as described above so that individual creativity as well as team performance can be maximized.
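The two network measures contrasted above can be computed directly. Degree centrality is standard; for structural holes, Burt's effective size (number of contacts minus their average interconnection) is one common unweighted proxy, used here as an illustrative stand-in for the study's exact measure:

```python
def degree_centrality(adj):
    """Fraction of the other nodes each node is tied to.
    `adj` maps each node to a set of its neighbors."""
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def effective_size(adj, ego):
    """Burt's effective size for an unweighted ego network.

    Larger values mean the ego spans more structural holes
    (fewer redundant ties among its contacts)."""
    alters = adj[ego]
    if not alters:
        return 0.0
    redundancy = sum(len(adj[a] & alters) for a in alters) / len(alters)
    return len(alters) - redundancy

# A star center brokers three otherwise-disconnected contacts;
# a triangle member's two contacts already know each other.
star = {"a": {"b", "c", "d"}, "b": {"a"}, "c": {"a"}, "d": {"a"}}
triangle = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b"}}
```

The contrast between the star center (high effective size) and the triangle member (low effective size) is exactly the structural-hole advantage the permanent-team design recommendation relies on.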

Development of a Real-time OS Based Control System for Laparoscopic Surgery Robot (복강경 수술로봇을 위한 실시간 운영체제 기반 제어 시스템의 개발)

  • Song, Seung-Joon;Park, Jun-Woo;Shin, Jung-Wook;Kim, Yun-Ho;Lee, Duk-Hee;Jo, Yung-Ho;Choi, Jae-Seoon;Sun, Kyung
    • Journal of Biomedical Engineering Research
    • /
    • v.29 no.1
    • /
    • pp.32-39
    • /
    • 2008
  • This paper reports on a real-time OS based master-slave robot control system for a laparoscopic surgery robot, which enables telesurgery and overcomes the shortcomings of conventional laparoscopic surgery. A surgical robot system requires a control system that can process large volumes of information, such as medical image data and the video signal from the endoscope, in real time, as well as precisely control the robot with high reliability. To meet these complex requirements, the use of a high-level real-time OS (Operating System) in the surgery robot controller is a must; this is common in many modern robot controllers, which adopt a real-time OS as the base system software on which specific functional modules are implemented for a more reliable and stable system. The control system consists of joint controllers, host controllers, and user interface units. The robot features a compact slave robot with 5 DOF (Degrees Of Freedom), expanding the workspace of each tool and increasing the number of tools operating simultaneously. Each master, slave and GUI (Graphical User Interface) host runs a dedicated RTOS (Real-time OS), RTLinux-Pro (FSMLabs Inc., U.S.A.), on which functional modules such as motion control, communication, and video signal integration are implemented, and all the hosts are on a gigabit Ethernet network for inter-host communication. Each master and slave controller set has a dedicated CAN (Controller Area Network) channel for control and monitoring signal communication with the joint controllers. A total of 4 pairs of master/slave manipulators are currently controlled by one host controller. The system showed satisfactory performance in both position control precision and master-slave motion synchronization in bench tests and animal experiments, and is now under further development for better safety and control fidelity toward a clinically applicable prototype.
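Master-slave motion synchronization of the kind evaluated above reduces, at its simplest, to the slave servoing toward the master's commanded position every control cycle. The single-axis model and proportional gain below are purely illustrative, not the controller described in the paper:

```python
def track_master(master_positions, kp=0.5):
    """One-axis slave servo: each control cycle, close a fraction `kp`
    of the remaining error toward the master's commanded position."""
    slave, trace = 0.0, []
    for target in master_positions:
        slave += kp * (target - slave)
        trace.append(slave)
    return trace

# A held master command of 1.0 (arbitrary units) is tracked to within
# 1% of the target after a handful of cycles.
trace = track_master([1.0] * 50)
```

A real controller would of course add velocity terms, motion scaling, and safety limits per joint; the point here is only the cyclic error-closing structure of the master-slave loop.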

Building Energy Savings due to Incorporated Daylight-Glazing Systems (통합 채광시스템의 건물 냉난방 에너지 성능평가)

  • Kim, Jeong-Tai;Ahn, Hyun-Tae;Kim, Gon
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers
    • /
    • v.19 no.6
    • /
    • pp.1-8
    • /
    • 2005
  • The quantity of daylight available to a space can be translated into an amount of energy savings through a building energy simulation. To achieve significant energy savings in general illumination, the electric lighting system must be incorporated with a daylight-activated dimmer control. A prototype configuration of an office interior has been established, and the integration between the building envelope and the lighting and HVAC systems is evaluated based on computer modeling of a lighting control facility. First of all, an energy-efficient luminaire system is designed, and the lighting analysis program Lumen-Micro 2000 predicts the optimal layout of a conventional fluorescent lighting fixture to meet the designed lighting level and calculates the unit power density, which translates into the demanded electric lighting energy. A dimming control system integrated with the contribution of daylighting has been applied to the operation of the artificial lighting. The annual cooling load due to lighting and the projected savings in cooling load due to daylighting under an overcast diffuse sky are evaluated with the computer software ENER-Win. In brief, the results from the building energy simulation, with measured daylight illumination levels and the performance of the lighting control system, indicate that daylighting can save over 70 percent of the energy required for general illumination in the perimeter zones throughout the year. About 25[%] of the electric energy for cooling and almost all of the heating energy may be saved by dimming and turning off the luminaires in the perimeter zones.
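The headline savings above are straightforward arithmetic once a baseline is fixed. The installed power density, floor area, and operating hours below are hypothetical placeholders; only the roughly 70% dimming saving fraction comes from the abstract:

```python
def lighting_energy_kwh(power_density_w_m2, area_m2, hours):
    """Annual lighting energy at full output, in kWh."""
    return power_density_w_m2 * area_m2 * hours / 1000.0

# Hypothetical perimeter zone: 11 W/m2 installed over 100 m2, 2600 h/yr.
base = lighting_energy_kwh(11.0, 100.0, 2600.0)

# The study reports daylight dimming saving over 70% of this demand.
dimmed = base * (1.0 - 0.70)
```

A full assessment would, as in the paper, also propagate the reduced internal heat gains into the cooling and heating load simulation; this snippet covers only the lighting term.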

Impact Assessment of Climate Change by Using Cloud Computing (클라우드 컴퓨팅을 이용한 기후변화 영향평가)

  • Kim, Kwang-S.
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.13 no.2
    • /
    • pp.101-108
    • /
    • 2011
  • Climate change could have a pronounced impact on natural and agricultural ecosystems. To assess the impact of climate change, projected climate data have been used as inputs to models. Because such studies are conducted only occasionally, it would be useful to employ cloud computing, which provides multiple instances of operating systems in a virtual environment to do processing on demand without building or maintaining physical computing resources. Furthermore, it would be advantageous to use open source geospatial applications in order to avoid the limitations of proprietary software when cloud computing is used. As a pilot study, Amazon Web Services Elastic Compute Cloud (EC2) was used to calculate the number of days with rain in a given month. Daily sets of climate projection data, about 70 gigabytes in total, were processed using virtual machines with a customized database transaction application. The application was linked against open source libraries for climate data and database access. In this approach, it took about 32 hours to process 17 billion rows of records in order to calculate rain days on a global scale over the next 100 years, using ten client instances and one server instance. Here I demonstrate that cloud computing can provide a high level of performance for impact assessment studies of climate change that require a considerable amount of data.
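The per-cell computation being distributed across the EC2 instances, counting days with rain in a given month from daily records, is itself tiny. A minimal sketch, assuming a hypothetical record layout of `(year, month, day, precipitation_mm)`:

```python
from collections import defaultdict

def rain_days_per_month(records, threshold_mm=0.0):
    """Count days whose precipitation exceeds `threshold_mm`,
    keyed by (year, month)."""
    counts = defaultdict(int)
    for year, month, _day, precip_mm in records:
        if precip_mm > threshold_mm:
            counts[(year, month)] += 1
    return dict(counts)

# Three January days, two with rain; one dry February day.
sample = [(2050, 1, 1, 4.2), (2050, 1, 2, 0.0),
          (2050, 1, 3, 1.1), (2050, 2, 1, 0.0)]
counts = rain_days_per_month(sample)
```

The study's contribution is not this aggregation but running it over 17 billion such rows by fanning the work out to on-demand virtual machines.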

Prioritizing of Civil-BIM DB Construction based on Geo-Spatial Information System (지형공간정보체계 기반의 토목-BIM DB구축 우선순위 선정)

  • Park, Dong Hyun;Kang, In Joon;Jang, Yong Gu;Lee, Byung Gul
    • Journal of Korean Society for Geospatial Information Science
    • /
    • v.23 no.1
    • /
    • pp.73-79
    • /
    • 2015
  • Recently, BIM has been spreading rapidly, and attempts are being made to apply BIM planning across many fields of engineering. However, when different software is used in the design phase and the construction phase, problems of mutual compatibility arise. Even where BIM technology has been applied and practical results have been produced, it is fair to say that its use has so far been limited to the visualization level. In this research, obtaining a BIM deliverable is not the primary purpose in itself; since its usefulness alone is judged incomplete, we sought to understand the trends and problems of terrain spatial information systems and BIM. Furthermore, a plan for building BIM in the civil field, in particular civil-BIM based on terrain spatial information technology, is presented. This research suggests that the high-capacity BIM DB generated over the whole process may degrade the performance of the subsequent stage, the structural system that connects terrain spatial information and civil-BIM. In order to manage the full life cycle optimally, the spatial analysis technologies for the stages after selecting the DB are described.

Information Privacy Concern in Context-Aware Personalized Services: Results of a Delphi Study

  • Lee, Yon-Nim;Kwon, Oh-Byung
    • Asia pacific journal of information systems
    • /
    • v.20 no.2
    • /
    • pp.63-86
    • /
    • 2010
  • Personalized services directly and indirectly acquire personal data, in part, to provide customers with higher-value services that are specifically context-relevant (such as place and time). Information technologies continue to mature and develop, providing greatly improved performance. Sensor networks and intelligent software can now obtain context data, and that is the cornerstone for providing personalized, context-specific services. Yet the danger of personal information leakage is increasing, because the data retrieved by the sensors usually contain private information. Various technical characteristics of context-aware applications have troubling implications for information privacy. In parallel with the increasing use of context for service personalization, information privacy concerns, such as the unrestricted availability of context information, have also increased. These privacy concerns are consistently regarded as a critical issue facing the success of context-aware personalized services. The entire field of information privacy is growing as an important area of research, with many new definitions and terminologies, because of the need for a better understanding of information privacy concepts. In particular, the factors of information privacy should be revised according to the characteristics of new technologies. However, previous information privacy factors for context-aware applications have at least two shortcomings. First, there has been little overview of the technology characteristics of context-aware computing: existing studies have only focused on a small subset of its technical characteristics. Therefore, there has not been a mutually exclusive set of factors that uniquely and completely describes information privacy in context-aware applications.
Second, user surveys have been widely used to identify factors of information privacy in most studies, despite the limits of users' knowledge of and experience with context-aware computing technology. To date, since context-aware services have not yet been widely deployed on a commercial scale, only very few people have prior experience with context-aware personalized services. It is difficult to build users' knowledge about context-aware technology even by increasing their understanding in various ways: scenarios, pictures, flash animations, etc. Nevertheless, conducting a survey on the assumption that the participants have sufficient experience with or understanding of the technologies shown in the survey may not be valid. Moreover, some surveys are based solely on simplifying and hence unrealistic assumptions (e.g., they only consider location information as context data). A better understanding of information privacy concern in context-aware personalized services is therefore highly needed. Hence, the purpose of this paper is to identify a generic set of factors for elemental information privacy concern in context-aware personalized services and to develop a rank-ordered list of information privacy concern factors. We consider overall technology characteristics to establish a mutually exclusive set of factors. A Delphi survey, a rigorous data collection method, was deployed to obtain reliable opinions from the experts and to produce a rank-ordered list. It therefore lends itself well to obtaining a set of universal factors of information privacy concern and their priority. An international panel of researchers and practitioners with expertise in the privacy and context-aware system fields was involved in our research. The format of the Delphi rounds faithfully follows the procedure for a Delphi study proposed by Okoli and Pawlowski.
This involves three general rounds: (1) brainstorming for important factors; (2) narrowing down the original list to the most important ones; and (3) ranking the list of important factors. For this study, experts were treated as individuals, not as panels. Adapting from Okoli and Pawlowski, we outlined the process of administering the study and performed three rounds. In the first and second rounds of the Delphi questionnaire, we gathered a set of mutually exclusive factors for information privacy concern in context-aware personalized services. In the first round, the respondents were asked to provide at least five main factors for the most appropriate understanding of information privacy concern. To do so, some of the main factors found in the literature were presented to the participants. The second round of the questionnaire discussed the main factors provided in the first round, fleshed out with relevant sub-factors. Respondents were then requested to evaluate each sub-factor's suitability against the corresponding main factors to determine the final sub-factors from the candidates. The sub-factors were found through a literature survey, and the final factors were those selected by over 50% of the experts. In the third round, a list of factors with corresponding questions was provided, and the respondents were requested to assess the importance of each main factor and its corresponding sub-factors. Finally, we calculated the mean rank of each item to produce the final result. While analyzing the data, we focused on group consensus rather than individual insistence. To do so, a concordance analysis, which measures the consistency of the experts' responses over successive rounds of the Delphi, was adopted during the survey process. As a result, the experts reported that context data collection and a highly identifiable level of identical data are the most important factor among the main factors and sub-factors, respectively.
Additional important sub-factors included the diverse types of context data collected, tracking and recording functionalities, and embedded and disappearing sensor devices. The average score of each factor is very useful for future context-aware personalized service development from the viewpoint of information privacy. The final factors differ from those proposed in other studies in the following ways. First, the concern factors differ from existing studies, which are based on privacy issues that may occur during the lifecycle of acquired user information. Our study helped to clarify these sometimes vague issues by determining which privacy concern issues are viable based on the specific technical characteristics of context-aware personalized services. Since a context-aware service differs in its technical characteristics from other services, we selected the specific characteristics with a higher potential to increase users' privacy concerns. Secondly, this study considered privacy issues in terms of service delivery and display, which were almost overlooked in existing studies, by introducing IPOS as the factor division. Lastly, for each factor, it correlated the level of importance with professionals' opinions on the extent to which users have privacy concerns. The traditional questionnaire method was not selected because users were considered to lack understanding of and experience with this new technology in context-aware personalized services. For understanding users' privacy concerns, the professionals in the Delphi questionnaire process selected context data collection, tracking and recording, and sensor networks as the most important factors among the technological characteristics of context-aware personalized services.
In the creation of context-aware personalized services, this study demonstrates the importance and relevance of determining an optimal methodology, and which technologies, in what sequence, are needed to acquire which types of users' context information. Most studies focus on which services and systems should be provided and developed by utilizing context information, along with the development of context-aware technology. However, the results of this study show that, in terms of users' privacy, it is necessary to pay greater attention to the activities that acquire context information. To examine the results of the sub-factor evaluation, additional studies would be necessary on approaches to reducing users' privacy concerns toward technological characteristics such as a highly identifiable level of identical data, diverse types of context data collected, tracking and recording functionality, and embedded and disappearing sensor devices. The factor ranked next in importance after input is context-aware service delivery, which is related to output. The results show that delivery and display, which present services to users in context-aware personalized services oriented toward the anywhere-anytime-any-device concept, have come to be regarded as even more important than in previous computing environments. Considering these concern factors when developing context-aware personalized services will help to increase the service success rate and, hopefully, user acceptance of those services. Our future work will be to adopt these factors for qualifying context-aware service development projects, such as u-city development projects, in terms of service quality and hence user acceptance.
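The concordance analysis used in the methodology above, checking whether the experts' rankings converge across Delphi rounds, is typically Kendall's coefficient of concordance W. A minimal implementation of the standard untied-ranks formula, offered as a sketch rather than the authors' exact computation:

```python
def kendalls_w(rankings):
    """Kendall's W for m experts each ranking the same n items 1..n
    (no ties). W = 1 means perfect agreement, W = 0 no agreement."""
    m, n = len(rankings), len(rankings[0])
    # Column totals: sum of ranks each item received across experts.
    totals = [sum(r[i] for r in rankings) for i in range(n)]
    mean = sum(totals) / n
    s = sum((t - mean) ** 2 for t in totals)
    return 12.0 * s / (m * m * (n ** 3 - n))

# Two experts in perfect agreement vs. perfect disagreement.
w_agree = kendalls_w([[1, 2, 3], [1, 2, 3]])
w_disagree = kendalls_w([[1, 2, 3], [3, 2, 1]])
```

Rounds can be stopped once W stabilizes at an acceptable level, which is how the study's "group consensus rather than individual insistence" criterion is made operational.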

RGB Channel Selection Technique for Efficient Image Segmentation (효율적인 이미지 분할을 위한 RGB 채널 선택 기법)

  • 김현종;박영배
    • Journal of KIISE:Software and Applications
    • /
    • v.31 no.10
    • /
    • pp.1332-1344
    • /
    • 2004
  • With the development of the information superhighway and multimedia-related technologies in recent years, more efficient technologies to transmit, store and retrieve multimedia data are required. Among such technologies, semantic-based image retrieval commonly annotates image data separately in order to attach meanings to it, alongside low-level property information covering color, texture, and shape. Although semantic-based information retrieval has been realized by utilizing vocabulary dictionaries as the given keywords, it has not yet escaped the limits of existing keyword-based text information retrieval. The second problem is that content-based image retrieval systems show decreased retrieval performance: it is difficult to separate an object from an image with a complex background, difficult to extract a region without excessive division of regions, and difficult to separate multiple objects from an image of a complex scene. To solve these problems, this paper establishes a content-based retrieval system that processes images in 5 steps. The most critical of these steps extracts, among the RGB channels, the ones with the largest and the smallest background. In particular, I propose a method that extracts the subject as well as the background by using the channel with the largest background. Also, to solve the second problem, I propose a method in which multiple objects are separated using RGB channel selection, having reduced the excessive division of regions by utilizing Watermerge's threshold value together with object separation via RGB channel separation.
The tests proved that the proposed methods are superior to existing methods in retrieval performance, making it possible to retrieve complex objects that have been difficult to retrieve until now.
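The channel-selection step described above, picking the RGB channels with the largest and smallest background, can be approximated by treating each channel's modal intensity as "background" and comparing its pixel share. This proxy is an assumption for illustration, not the paper's exact selection criterion:

```python
import numpy as np

def pick_channels(rgb):
    """Return (channel with largest background, channel with smallest),
    where 'background' is proxied by the modal value's pixel share."""
    shares = []
    for c in range(3):
        _, counts = np.unique(rgb[..., c], return_counts=True)
        shares.append(counts.max() / rgb[..., c].size)
    return int(np.argmax(shares)), int(np.argmin(shares))

# Channel 0: uniform (all background); channel 1: half/half;
# channel 2: every pixel distinct (no dominant background).
demo = np.zeros((4, 4, 3), dtype=int)
demo[..., 1] = np.tile([0, 1], (4, 2))
demo[..., 2] = np.arange(16).reshape(4, 4)
```

Segmentation would then proceed on the selected channels, with over-segmented regions merged by the thresholding step the abstract attributes to Watermerge.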