• Title/Abstract/Keyword: quality of software

Search results: 2,812 (processing time: 0.03 seconds)

Characteristics and Implications of Marseille's Euroméditerranée as an Integrated Urban Regeneration Project (통합형 도시재생사업으로서 마르세유 유로메디테라네의 특성과 시사점)

  • Wonseok Park
    • Land and Housing Review
    • /
    • v.15 no.1
    • /
    • pp.99-115
    • /
    • 2024
  • This study investigates Marseille's Euroméditerranée project and draws policy implications for revitalizing domestic urban regeneration projects. First, we identify Euroméditerranée as a pivotal urban regeneration effort executed by EPAEM, an organization that fosters governance-driven project advancement through collaboration and investment from both central and local governments. This endeavor has contributed significantly to revitalizing Marseille and enriching the quality of life of its residents. Second, the project has three notable features: a consolidated approach combining full redevelopment with rehabilitation; integrated regeneration covering both hardware (physical regeneration) and software (economic, cultural, and environmental regeneration); and a government-led project structure. Finally, we suggest that policymakers consider the economic scale of urban regeneration projects, national-level government organizations, and efficient public-private partnerships.

Usability index evaluation system for mobile WAP service (무선인터넷 서비스 사용성 지수 평가 체계)

  • Park, Hwan-Su
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2008.02b
    • /
    • pp.152-157
    • /
    • 2008
  • Customer satisfaction with a WAP service relies heavily on the service's usability, owing to the limited display size of a mobile phone and the constraints on realizing the UI (user interface) for function keys, the browser, and the OS (operating system). Currently, many contents providers develop and deliver varying services, so it is critical to control the UI quality level with consistent standards and in a consistent manner. This study suggests a usability index evaluation system to achieve consistent UI quality control across various WAP services. The system adopts both top-down and bottom-up approaches. The former derives UI design components and evaluation checklists for the WAP service from usability attributes and UI principles. The latter derives usability-related evaluation checklists from the established UI design features and then groups them from the viewpoint of usability principles and attributes. This bidirectional approach has two notable advantages: it allows thorough examination of potential elements that can cause usability problems from the standpoint of usability attributes, and it derives specific evaluation elements from the perspective of UI design components relevant to the real service environment. The evaluation system forms a hierarchical structure by networking usability attributes, UI guidelines that express the usability principles for each attribute, and usability evaluation checklists for each UI component that enable specific evaluation. In particular, each evaluation checklist has concrete content and a format that can be readily marked O/X. The score is the ratio of the number of positively answered items to the total number of items, which enables a quantitative evaluation of the usability of a mobile WAP service. The validity of the proposed evaluation system was demonstrated through comparative analysis against real usability problems found in user tests. Software was developed that provides guidelines on the evaluation objects, criteria, and examples for each checklist and automatically calculates the score; it was applied to evaluating and improving a real mobile WAP service.
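
The scoring rule in this abstract (the ratio of O-marked checklist items to all items) is simple enough to sketch directly. The checklist items and attribute groupings below are invented placeholders, not the paper's actual checklists:

```python
# Usability index as described: ratio of positively answered (O)
# checklist items to total items, grouped per usability attribute.
# Checklist contents here are illustrative placeholders, not the
# paper's actual items.

checklist = {
    "learnability": {"menu labels are self-descriptive": "O",
                     "navigation depth <= 3 levels": "X"},
    "efficiency":   {"function keys mapped consistently": "O",
                     "page load feedback is shown": "O"},
}

def usability_index(results: dict) -> float:
    """Score = (# of O answers) / (total # of checklist items)."""
    answers = [mark for items in results.values() for mark in items.values()]
    return answers.count("O") / len(answers)

print(f"usability index: {usability_index(checklist):.2f}")  # 0.75
```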


CIA-Level Driven Secure SDLC Framework for Integrating Security into SDLC Process (CIA-Level 기반 보안내재화 개발 프레임워크)

  • Kang, Sooyoung;Kim, Seungjoo
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.30 no.5
    • /
    • pp.909-928
    • /
    • 2020
  • From the early 1970s, the US government began to recognize that penetration testing alone could not assure the security quality of products: the vulnerabilities and faults identified can vary with the capabilities of the testing team, and no penetration team can guarantee that "no vulnerabilities were found" means "the product has no vulnerabilities". The US government therefore realized that, to improve the security quality of products, the development process itself should be managed systematically and strictly, and from the 1980s it began to publish various standards for development methodology and an evaluation and procurement system embedding the "security-by-design" concept. Security-by-design means reducing a product's complexity by considering security from the initial phases of the development lifecycle, such as requirements analysis and design, in order to ultimately achieve trustworthiness of the product. The concept spread to the private sector from 2002 under the name Secure SDLC, led by Microsoft and IBM, and is currently used in various fields such as automotive and advanced weapon systems. The problem, however, is that Secure SDLC standards and guidelines contain only abstract and declarative content, so they are not easy to implement in the field. In this paper we therefore present a new framework for specifying the level of Secure SDLC an enterprise desires. Our proposed CIA (functional Correctness, safety Integrity, security Assurance)-level-based security-by-design framework combines an evidence-based security approach with the existing Secure SDLC. Using our methodology, one can first quantitatively show the gap in Secure SDLC process level between a company and its competitors; second, the framework is very useful when building a Secure SDLC in the field, because the detailed activities and documents needed to reach the desired level can easily be derived.
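
A rough illustration of the gap analysis the framework enables. The level definitions and the numeric values below are invented for illustration; the paper defines its own CIA levels and the evidence required for each:

```python
# Illustrative only: ordinal CIA (functional Correctness, safety
# Integrity, security Assurance) levels for two organizations, and
# the per-axis gap between them. Actual level definitions come from
# the paper's framework, not from this sketch.

from dataclasses import dataclass

@dataclass
class CIALevel:
    correctness: int   # functional Correctness level (ordinal)
    integrity: int     # safety Integrity level
    assurance: int     # security Assurance level

def gap(company: CIALevel, competitor: CIALevel) -> dict:
    """Positive values mean the competitor is ahead on that axis."""
    return {
        "correctness": competitor.correctness - company.correctness,
        "integrity":   competitor.integrity   - company.integrity,
        "assurance":   competitor.assurance   - company.assurance,
    }

print(gap(CIALevel(2, 1, 2), CIALevel(3, 2, 2)))
# {'correctness': 1, 'integrity': 1, 'assurance': 0}
```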

A Study on the Characteristics of an Information System Risk Management Process (정보시스템 위험관리 프로세스 특성에 관한 연구)

  • Kim, Tai-Dal;Lee, Hyung-Won
    • The KIPS Transactions: Part D
    • /
    • v.14D no.3 s.113
    • /
    • pp.303-310
    • /
    • 2007
  • Information system failures vary widely, from simple software errors to inadequate program testing and a lack of physical facilities for damage prevention, and even a trivial fault can cause vast damage. Recently, such problems have become difficult to solve with simple external security systems alone; enterprises now need comprehensive countermeasures and appropriate responses to the risks surrounding their information technology in general. Accordingly, this paper studies the characteristics needed to model a comprehensive risk management system that can anticipate and cope with risks to information, systems, and data arising inside and outside an organization, considering the integrity, availability, and confidentiality of IT resources.

Development of Industrial Embedded System Platform (산업용 임베디드 시스템 플랫폼 개발)

  • Kim, Dae-Nam;Kim, Kyo-Sun
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.47 no.5
    • /
    • pp.50-60
    • /
    • 2010
  • For the last half century, the personal computer and software industries have prospered on the incessant evolution of computer systems. In the 21st century, the embedded system market has grown greatly as the market shifted to mobile gadgets. While multimedia gadgets such as mobile phones, navigation systems, and PMPs are pouring into the market, most industrial control systems still rely on 8-bit micro-controllers and simple application software techniques. Unfortunately, the technological barrier, which requires additional investment and higher-quality manpower to overcome, and the business risks arising from uncertain market growth and the competitiveness of the resulting products have prevented companies in the industry from taking advantage of such advanced technologies. However, high-performance, low-power, and low-cost hardware and software platforms will enable their high-technology products to be developed and recognized by potential clients in the future. This paper presents such a platform for industrial embedded systems. The platform is based on the Telechips TCC8300 multimedia processor, which embeds a variety of parallel hardware for implementing multimedia functions, and uses open-source Embedded Linux, TinyX, and GTK+ for the GUI to minimize technology costs. To estimate the expected performance and power consumption, the performance improvement and power consumption due to each of the enabled hardware sub-systems, including the YUV2RGB frame converter, were measured. An analytic model was devised to check the feasibility of a new application and to trade off its performance and power consumption; the validity of the model was confirmed by implementing a real target system. The cost can be further mitigated by using hardware parts that are already in mass production, mostly for the cell-phone market.
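
The abstract's analytic model trades performance against power per enabled hardware sub-system. A toy version of that idea is sketched below with entirely made-up power and speed-up figures (the paper reports measured values for the TCC8300 sub-systems):

```python
# Toy feasibility model: given per-subsystem power draw and speed-up
# factors (numbers invented for illustration), estimate whether
# enabling a set of hardware blocks meets a frame-rate target within
# a power budget.

SUBSYSTEMS = {
    # name: (extra power in mW, speed-up factor) -- illustrative values
    "yuv2rgb_converter": (35.0, 1.8),
    "video_decoder":     (60.0, 2.5),
}

BASE_POWER_MW = 120.0     # assumed baseline SoC power
BASE_FPS = 9.0            # assumed software-only frame rate

def estimate(enabled: list[str], fps_target: float, power_budget_mw: float):
    power = BASE_POWER_MW
    fps = BASE_FPS
    for name in enabled:
        extra_mw, speedup = SUBSYSTEMS[name]
        power += extra_mw
        fps *= speedup
    feasible = fps >= fps_target and power <= power_budget_mw
    return fps, power, feasible

print(estimate(["yuv2rgb_converter", "video_decoder"], 30.0, 250.0))
# (40.5, 215.0, True)
```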

Power Conscious Disk Scheduling for Multimedia Data Retrieval (저전력 환경에서 멀티미디어 자료 재생을 위한 디스크 스케줄링 기법)

  • Choi, Jung-Wan;Won, Yoo-Jip;Jung, Won-Min
    • Journal of KIISE: Computer Systems and Theory
    • /
    • v.33 no.4
    • /
    • pp.242-255
    • /
    • 2006
  • In recent years, the popularization of mobile devices such as smart phones, PDAs, and MP3 players has made power management technology increasingly essential. Meanwhile, the hard disk offers large capacity and high speed at a low price and can now be made small enough for mobile devices; its drawback is that it consumes too much power to embed in mobile devices. Motivated by this, in this paper we propose and evaluate methods for minimizing power consumption while playing back multimedia data stored on disk in real time. The strict limits on the power consumption of mobile devices strongly influence the design of both hardware and software. One difference between real-time multimedia streaming data and legacy text-based data is the requirement for continuity of data supply, which forces the disk drive to remain in the active state for the entire playback duration; from a power management point of view, this is a great burden. The legacy power management function of a mobile disk drive also degrades the quality of multimedia playback because of excessive I/O requests issued while the disk is in the standby state. We therefore analyze the power consumption profile of the disk drive in detail and develop an algorithm that plays multimedia data effectively using less power. The algorithm calculates the number of data blocks to be read and the durations of the active and standby states, and from these performs optimal scheduling that ensures continuous playback of the data blocks stored on the mobile disk drive. We implemented the algorithm in publicly available MPEG player software; it saves up to 60% of power consumption compared with a disk drive kept active full-time, and 38% compared with a drive controlled by the native power management method.
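
The scheduling computation described (burst-read a buffer at disk speed, then stand by while playback drains it, waking early enough to avoid a stall) can be sketched as follows; all rates, the buffer size, and the spin-up latency are assumed values, not the paper's measurements:

```python
# Burst-read scheduling sketch (all numbers invented): fill the
# playback buffer at the disk's transfer rate, put the disk in
# standby while playback drains the buffer, and wake it one
# spin-up time early so the stream never stalls.

BUFFER_BYTES = 8 * 1024 * 1024      # assumed playback buffer
DISK_RATE = 20 * 1024 * 1024        # bytes/s while active (assumed)
PLAY_RATE = 1.5 * 1024 * 1024       # bytes/s consumed by the player
SPINUP_S = 1.2                      # assumed standby -> active latency

def schedule():
    active_s = BUFFER_BYTES / DISK_RATE    # time to fill the buffer
    drain_s = BUFFER_BYTES / PLAY_RATE     # time playback takes to empty it
    standby_s = max(0.0, drain_s - active_s - SPINUP_S)
    return active_s, standby_s

active, standby = schedule()
print(f"active {active:.2f}s, standby {standby:.2f}s per cycle")
```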

Development of Quality Assurance Software for PRESAGE(REU) Gel Dosimetry (PRESAGE(REU) 겔 선량계의 분석 및 정도 관리 도구 개발)

  • Cho, Woong;Lee, Jaegi;Kim, Hyun Suk;Wu, Hong-Gyun
    • Progress in Medical Physics
    • /
    • v.25 no.4
    • /
    • pp.233-241
    • /
    • 2014
  • The aim of this study is to develop a new software tool for 3D dose verification using the PRESAGE(REU) gel dosimeter. The tool includes the following functions: importing 3D doses from treatment planning systems (TPS), importing 3D optical density (OD), converting ODs to doses, 3D registration between two volumetric data sets by translational and rotational transformations, and evaluation with the 3D gamma index. To obtain the correlation between ODs and doses, CT images of a cylindrical PRESAGE(REU) gel were acquired, and a volumetric modulated arc therapy (VMAT) plan was designed to deliver doses from 1 Gy to 6 Gy to six disk-shaped virtual targets along the z-axis. After the VMAT plan was delivered, 3D OD data were reconstructed from 512 projections acquired with a Vista™ optical CT scanner (Modus Medical Devices Inc., Canada) every 2 hours after irradiation. A curve for converting ODs to doses was derived by comparing the TPS dose profile with the OD profile along the z-axis, and the 3D OD data were converted to absorbed doses using this curve. Supra-linearity was observed between doses and ODs, and the ODs decayed by about 60% per 24 hours, depending on their magnitudes. Doses measured from the PRESAGE(REU) gel agreed well with the TPS doses in the central region, but large under-doses were observed in the peripheral region of the cylindrical geometry. The gamma passing rate for the 3D doses was 70.36% under criteria of 3% dose difference and 3 mm distance-to-agreement. The low passing rate resulted from the refractive index mismatch between the PRESAGE gel and the oil bath in the optical CT scanner. In conclusion, the developed software was useful for 3D dose verification with PRESAGE gel dosimetry, but further improvement of the gel dosimetry system was required.
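
The 3%/3 mm gamma evaluation mentioned above is a standard dose-comparison metric; a simplified 1D version is sketched below (real 3D gamma searches a volumetric neighborhood, and the dose profiles here are invented):

```python
# Simplified 1D gamma-index check in the spirit of the 3%/3 mm
# criterion the paper uses. A point passes when gamma <= 1.

import numpy as np

def gamma_1d(ref, meas, dx_mm=1.0, dose_tol=0.03, dist_tol_mm=3.0):
    """Return per-point gamma for measured vs reference dose profiles."""
    positions = np.arange(len(ref)) * dx_mm
    gammas = np.empty(len(ref))
    for i, (x, d) in enumerate(zip(positions, meas)):
        dose_term = (d - ref) / (dose_tol * ref.max())   # global normalization
        dist_term = (x - positions) / dist_tol_mm
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

ref = np.array([1.0, 2.0, 4.0, 6.0, 5.0, 3.0])   # invented reference profile
meas = ref * 1.02                                 # 2% uniform over-response
g = gamma_1d(ref, meas)
print(f"passing rate: {100 * np.mean(g <= 1.0):.1f}%")
```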

Opportunity Tree Framework Design For Optimization of Software Development Project Performance (소프트웨어 개발 프로젝트 성능의 최적화를 위한 Opportunity Tree 모델 설계)

  • Song Ki-Won;Lee Kyung-Whan
    • The KIPS Transactions: Part D
    • /
    • v.12D no.3 s.99
    • /
    • pp.417-428
    • /
    • 2005
  • Today, IT organizations perform projects with a vision tied to marketing and financial profit, and realizing that vision requires improving project performance in terms of QCD. Organizations have made great efforts to achieve this through process improvement. Large companies such as IBM, Ford, and GE attribute over 80% of their success to business process re-engineering with information technology, rather than to the improvement effect of computerization alone. Achieving the objective requires collecting, analyzing, and managing data on performed projects, but quantitative measurement is difficult because software is invisible and the effect and efficiency caused by process change cannot be identified visually; it is therefore not easy to extract an improvement strategy. This paper measures and analyzes project performance, focusing on an organization's external effectiveness and internal efficiency (Quality, Delivery, Cycle time, and Waste). Based on the measured project performance scores, an OT (Opportunity Tree) model was designed for optimizing project performance. The design process is as follows. First, meta data are derived from projects and analyzed with a quantitative GQM (Goal-Question-Metric) questionnaire. Then the project performance model is designed with the data obtained from the questionnaire, and the organization's performance score for each area is calculated. The value is revised by integrating the measured area scores with vision weights from all stakeholders (CEO, middle managers, developers, investors, and customers). Through this, routes for improvement are presented and an optimized improvement method is suggested. Existing methods of software process improvement have been highly effective for individual processes, but somewhat unsatisfactory as a structural means of developing and systematically managing strategies when applying those processes to projects. The proposed OT model provides a solution to this problem: it offers an optimal improvement method in line with the organization's goals and, applied with the proposed methods, can reduce the risks that may occur in the course of process improvement. In addition, satisfaction with the improvement strategy is increased by obtaining vision weights from all stakeholders through the qualitative questionnaire and reflecting them in the calculation. The OT is also useful for expanding market and financial performance by controlling Quality, Delivery, Cycle time, and Waste.
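
The score-revision step (integrating per-area scores with stakeholder vision weights) can be illustrated with a small calculation. The areas follow the paper's Quality/Delivery/Cycle time/Waste breakdown, but all scores and weights below are invented:

```python
# Sketch of the score revision step: per-area performance scores are
# combined with vision weights collected from stakeholders. All
# numbers are illustrative, not from the paper.

AREA_SCORES = {"quality": 72, "delivery": 65, "cycle_time": 80, "waste": 58}

# Each stakeholder distributes a weight of 1.0 over the four areas.
VISION_WEIGHTS = {
    "ceo":       {"quality": 0.4, "delivery": 0.3, "cycle_time": 0.2, "waste": 0.1},
    "developer": {"quality": 0.3, "delivery": 0.2, "cycle_time": 0.3, "waste": 0.2},
    "customer":  {"quality": 0.5, "delivery": 0.3, "cycle_time": 0.1, "waste": 0.1},
}

def revised_score(scores, weights):
    """Average the stakeholder-weighted area scores into one value."""
    per_stakeholder = [
        sum(scores[a] * w[a] for a in scores) for w in weights.values()
    ]
    return sum(per_stakeholder) / len(per_stakeholder)

print(f"revised performance score: {revised_score(AREA_SCORES, VISION_WEIGHTS):.1f}")
```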

Automated Functionality Test Methods for Web-based Applications (웹 기반 어플리케이션의 기능 테스트 자동화 방법)

  • Kuk, Seung-Hak;Kim, Hyeon-Soo
    • The KIPS Transactions: Part D
    • /
    • v.14D no.5
    • /
    • pp.517-530
    • /
    • 2007
  • Recently, web applications have grown rapidly and become more and more complex, and as they do so, there is growing concern about their quality. However, very little attention has been paid to testing web applications, and practical research efforts and tools are scarce. In this paper, we therefore suggest automated testing methods for web applications. The methods generate an analysis model by analyzing the HTML code and the source code; test targets are then identified and test cases are extracted from the analysis model. In addition, test drivers and test data are generated automatically and deployed on the web server to establish a testing environment. Through this process the testing of web applications can be automated; moreover, the automation makes our approach more effective than existing research efforts.
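
The first step described (identifying test targets by analyzing HTML) can be sketched with the standard library alone; the paper's tooling also analyzes server-side source code, and the HTML sample here is invented:

```python
# Minimal sketch of the "identify test targets from HTML" step:
# collect form actions and their input fields as candidate targets
# for test-case generation.

from html.parser import HTMLParser

class FormExtractor(HTMLParser):
    """Collect form actions and their input field names."""
    def __init__(self):
        super().__init__()
        self.forms = []           # list of (action, [input names])
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "form":
            self.forms.append((a.get("action", ""), []))
        elif tag == "input" and self.forms:
            self.forms[-1][1].append(a.get("name", ""))

html = """
<form action="/login" method="post">
  <input name="user_id"><input name="password">
</form>
"""

parser = FormExtractor()
parser.feed(html)
for action, fields in parser.forms:
    print(f"test target: {action}, test-case inputs: {fields}")
# test target: /login, test-case inputs: ['user_id', 'password']
```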

Quality Assurance of Leaf Speed for Dynamic Multileaf Collimator (MLC) Using Dynalog Files (Dynalog file을 이용한 동적다엽조준기의 Leaf 속도 정도관리 평가)

  • Kim, Joo Seob;Ahn, Woo Sang;Lee, Woo Suk;Park, Sung Ho;Choi, Wonsik;Shin, Seong Soo
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.26 no.2
    • /
    • pp.305-312
    • /
    • 2014
  • Purpose : The purpose of this study is to analyze the mechanical accuracy and leaf speed of the dynamic multileaf collimator (DMLC) and to determine an appropriate period for quality assurance (QA). Materials and Methods : QA of the DMLC equipped with Millennium 120 leaves was performed a total of 92 times from January 2012 to June 2014. The leaf position accuracy and isocenter coincidence of the MLC were checked using graph paper and Gafchromic EBT film, respectively. The stability of leaf speed was verified using a test file requiring the leaves to reach maximum speed during gantry rotation. At the end of every leaf speed QA, the dynalog files created by the MLC controller were analyzed using dynalog file viewer software. A dynalog file contains the planned versus actual positions of all leaves and provides the error RMS (root-mean-square) of each leaf's deviations and an error histogram of all leaf deviations. In this study, the data obtained from the leaf speed QA were used to screen for degradation of leaf speed and to determine the need for motor replacement. Results : The leaf position accuracy and isocenter coincidence of the MLC were within the tolerance range recommended by the TG-142 report. A total of 56 motors were replaced over the whole QA period. Among the motors flagged by the QA, gradually increasing error RMS patterns were much more common than suddenly increasing ones; the average error RMS values of the gradually and suddenly increasing patterns were 0.298 cm and 0.273 cm, respectively. Although these averages were within the 0.35 cm recommended by the vendor, motors were replaced according to the criterion of no counts with misplacement > 1 cm. On average, motors showing gradually increasing error RMS values were replaced after 22 days. A further 28 motors were replaced independently of the leaf speed QA. Conclusion : This study performed periodic MLC QA to analyze the mechanical accuracy and leaf speed of the DMLC. The leaf position accuracy and isocenter coincidence were within the tolerance values recommended by the TG-142 report. Based on the results of the leaf speed QA, we conclude that the leaf speed QA protocol for the DMLC should be performed at least bimonthly to screen the performance of leaf speed. This periodic QA protocol can help ensure accurate IMRT delivery to patients by maintaining leaf speed performance.
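
The dynalog analysis described (per-leaf RMS of planned-versus-actual deviations, screened against the vendor's 0.35 cm guideline) can be sketched as follows; the sample data are invented, and real dynalog files record one planned/actual pair per leaf per control-point snapshot:

```python
# Sketch of the dynalog screening step: compute the RMS of
# planned-vs-actual leaf position deviations per leaf and flag
# leaves whose error RMS exceeds the 0.35 cm vendor guideline.

import math

# leaf id -> list of (planned_cm, actual_cm) samples (illustrative)
dynalog = {
    1: [(2.00, 2.01), (2.50, 2.52), (3.00, 3.05)],
    2: [(2.00, 2.30), (2.50, 2.90), (3.00, 3.45)],
}

def error_rms(samples):
    """Root-mean-square of the planned-minus-actual deviations."""
    return math.sqrt(sum((p - a) ** 2 for p, a in samples) / len(samples))

for leaf, samples in dynalog.items():
    rms = error_rms(samples)
    flag = "check motor" if rms > 0.35 else "ok"
    print(f"leaf {leaf:3d}: error RMS = {rms:.3f} cm ({flag})")
```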