• Title/Summary/Keyword: Software-as-a-Service

A Meta-analysis on the Variables related with Recovery among Persons with Mental Illness (정신장애인의 회복관련변인에 관한 메타분석)

  • Park, Jung-Im
    • The Journal of the Korea Contents Association
    • /
    • v.18 no.12
    • /
    • pp.535-546
    • /
    • 2018
  • This study conducted a meta-analysis to systematically synthesize the variables related to recovery among persons with mental illness in Korea. Theses and dissertations published in Korea between 1999 and 2018 were reviewed systematically, and a total of 24 studies were included. Using Comprehensive Meta Analysis (CMA) 3.0 software, average effect sizes and moderator effects for the variables related to recovery were calculated. The results were as follows. First, a total of 16 variables related to recovery among persons with mental illness were identified. Second, the variables showing large effect sizes were social support (r=.575), empowerment (r=.555), self-efficacy (r=.544), social skill (r=.500), relationship with social worker (r=.482), stigma (r=-.446), and family support (r=.418). Third, the variables with medium effect sizes were interpersonal relationship capacity (r=.391), insight (r=.373), agency service satisfaction (r=.366), and symptom (r=-.239). Fourth, work experience (r=.188) showed a small effect size. Fifth, moderator analyses were conducted by residence status (community or psychiatric hospital), and moderator effects were identified for social support and family support. Based on the findings, theoretical and clinical implications for recovery among persons with mental illness in Korea are discussed.
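
As an illustration of the pooling step such a meta-analysis performs, the sketch below combines correlation effect sizes via Fisher's z transformation with inverse-variance weights. It is a generic fixed-effect illustration, not the CMA 3.0 procedure used in the study, and the (r, n) pairs are hypothetical.

```python
# Minimal sketch (not the CMA 3.0 implementation): pooling correlation effect
# sizes with a fixed-effect model via Fisher's z transformation. The r values
# and sample sizes below are hypothetical placeholders, not data from the study.
import math

def pool_correlations(studies):
    """studies: list of (r, n) pairs; returns the pooled correlation."""
    num, den = 0.0, 0.0
    for r, n in studies:
        z = 0.5 * math.log((1 + r) / (1 - r))   # Fisher's z transform
        w = n - 3                                # inverse-variance weight (var = 1/(n-3))
        num += w * z
        den += w
    z_bar = num / den
    return (math.exp(2 * z_bar) - 1) / (math.exp(2 * z_bar) + 1)  # back-transform to r

if __name__ == "__main__":
    example = [(0.55, 120), (0.60, 85), (0.48, 200)]  # hypothetical (r, n) pairs
    print(f"pooled r = {pool_correlations(example):.3f}")
```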

Detecting TOCTOU Race Condition on UNIX Kernel Based File System through Binary Analysis (바이너리 분석을 통한 UNIX 커널 기반 File System의 TOCTOU Race Condition 탐지)

  • Lee, SeokWon;Jin, Wen-Hui;Oh, Heekuck
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.31 no.4
    • /
    • pp.701-713
    • /
    • 2021
  • A race condition is a vulnerability in which two or more processes access or manipulate a shared resource at the same time, producing unintended results. It can lead to problems such as denial of service and privilege escalation. When a vulnerability is found in software, the relevant information is documented, but the cause of the vulnerability or the source code is often not disclosed; in that case, analysis at the binary level is necessary to detect it. This paper aims to detect the Time-Of-Check Time-Of-Use (TOCTOU) race condition vulnerability of UNIX kernel-based file systems at the binary level. Various static and dynamic detection techniques have been studied for this vulnerability, but existing static-analysis tools detect it through source code analysis, and few studies have been conducted at the binary level. In this paper, we propose a method for detecting TOCTOU race conditions in file-system code based on the Control Flow Graph and Call Graph, using the Binary Analysis Platform (BAP), a binary static-analysis tool.
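
For illustration, the sketch below shows the classic check/use pattern such a detector flags, written in Python rather than at the binary level the paper targets; the file path and the safer file-descriptor variant are illustrative assumptions, not the paper's detection logic.

```python
# Minimal sketch of the check/use pattern a TOCTOU detector looks for. The
# paper's detector analyzes binaries via BAP; this Python version only
# illustrates the vulnerable call ordering and one common mitigation.
import os

def write_report_vulnerable(path: str, data: str) -> None:
    # CHECK: permission test on the path name
    if os.access(path, os.W_OK):
        # ...window in which another process may swap the path (e.g. a symlink)...
        # USE: the path is resolved again here, so the check above may no longer hold
        with open(path, "w") as f:
            f.write(data)

def write_report_safer(path: str, data: str) -> None:
    # Safer variant: open first, then operate on the file descriptor, so the
    # check and the use refer to the same kernel object.
    fd = os.open(path, os.O_WRONLY | os.O_CREAT, 0o600)
    with os.fdopen(fd, "w") as f:
        f.write(data)
```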

Performance Analysis of GPS and QZSS Orbit Determination using Pseudo Ranges and Precise Dynamic Model (의사거리 관측값과 정밀동역학모델을 이용한 GPS와 QZSS 궤도결정 성능 분석)

  • Beomsoo Kim;Jeongrae Kim;Sungchun Bu;Chulsoo Lee
    • Journal of Advanced Navigation Technology
    • /
    • v.26 no.6
    • /
    • pp.404-411
    • /
    • 2022
  • The main function in operating a satellite navigation system is to accurately determine the orbits of the navigation satellites and transmit them in the navigation message. In this study, we developed software that determines a navigation satellite's orbit by combining an extended Kalman filter with a precise dynamic model. Global positioning system (GPS) and quasi-zenith satellite system (QZSS) orbit determination was performed using International GNSS Service (IGS) ground-station observations, and the user range error (URE), a key performance indicator of a navigation system, was calculated by comparison with the IGS precise ephemeris. When the clock error of the navigation satellite is estimated, the radial orbit error and the clock error are highly inversely correlated and cancel each other out, so the standard deviations of the URE for GPS and QZSS are small, namely 1.99 m and 3.47 m, respectively. Instead of estimating the clock error of the navigation satellite, the orbit was also determined by replacing the clock error of the navigation message with a modeled value, and the regional correlation with the URE and the effect of the ground-station arrangement were analyzed.
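
The sketch below shows one generic extended Kalman filter predict/update cycle of the kind the orbit-determination software combines with a dynamic model; the f, F, h, H, Q, R inputs are assumed placeholders for the precise dynamics and pseudorange measurement model, which are not reproduced here.

```python
# Minimal sketch of one extended Kalman filter predict/update cycle, assuming
# generic f (dynamics) and h (measurement, e.g. pseudorange) functions with
# Jacobians F and H; this is a textbook EKF step, not the paper's precise
# dynamic model or its pseudorange processing.
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """x: state estimate, P: covariance, z: measurement vector."""
    # Predict with the (nonlinear) dynamic model
    x_pred = f(x)
    F_k = F(x)
    P_pred = F_k @ P @ F_k.T + Q

    # Update with the pseudorange-type measurement model
    H_k = H(x_pred)
    y = z - h(x_pred)                              # innovation
    S = H_k @ P_pred @ H_k.T + R                   # innovation covariance
    K = P_pred @ H_k.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
    return x_new, P_new
```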

Propose a Static Web Standard Check Model

  • Hee-Yeon Won;Jae-Woong Kim;Young-Suk Chung
    • Journal of the Korea Society of Computer and Information
    • /
    • v.29 no.4
    • /
    • pp.83-89
    • /
    • 2024
  • After Internet Explorer reached the end of its service, the use of ActiveX ended and the Non-ActiveX policy spread. HTML5 is used as the standard for web pages built under the Non-ActiveX policy. HTML5, developed by the W3C (World Wide Web Consortium), provides a better web-application experience through its APIs, with various elements and attributes added to the browser without plug-ins. However, new security vulnerabilities have been discovered in the newly added technologies, and these vulnerabilities have widened the scope of attacks. There is little research on finding possible security vulnerabilities in websites that have adopted HTML5. This paper proposes a model for detecting tags and attributes with web vulnerabilities by analyzing the web pages of public institutions from which plug-ins have been removed within the last five years. When the proposed model is applied to a web page, it can analyze the page's standards compliance and vulnerabilities even after the plug-in is removed, so reliable web services can be provided. It is also expected to help prevent the financial and physical damage caused by hacking.
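
A minimal sketch of the kind of static tag/attribute scan such a model performs is given below; the flagged tags and attribute prefixes are hypothetical examples, since the paper's actual rule set is not described in the abstract.

```python
# Minimal sketch of a static tag/attribute check, assuming a hypothetical
# blocklist of features to flag (e.g. inline event handlers, plug-in-era
# <object>/<embed> tags); this is not the rule set of the proposed model.
from html.parser import HTMLParser

FLAGGED_TAGS = {"object", "embed", "applet"}          # hypothetical examples
FLAGGED_ATTR_PREFIXES = ("on",)                       # inline event handlers, e.g. onclick

class StandardChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag in FLAGGED_TAGS:
            self.findings.append((tag, None))
        for name, _ in attrs:
            if name.lower().startswith(FLAGGED_ATTR_PREFIXES):
                self.findings.append((tag, name))

if __name__ == "__main__":
    checker = StandardChecker()
    checker.feed('<div onclick="run()"><embed src="legacy.swf"></div>')  # sample markup
    print(checker.findings)   # [('div', 'onclick'), ('embed', None)]
```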

The Case on Valuation of IT Enterprise (IT 기업의 가치평가 사례연구)

  • Lee, Jae-Il;Yang, Hae-Sul
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.8 no.4
    • /
    • pp.881-893
    • /
    • 2007
  • IT (Information Technology)-based industries have driven the recent digital revolution and the appearance of various types of information services, expanding broadly into info-communication device companies, info-communication service companies, software companies, and so on. Accordingly, the need to evaluate the value of IT businesses for M&A or liquidation is growing rapidly. Unlike other industries, however, the IT industry has a short life cycle, and there is not yet an objective valuation model for IT companies as there is for general businesses. This thesis therefore analyzes various valuation techniques, including the newly emerging ROV (Real Options Valuation). DCF, which converts a company's cash flows, including those from tangible assets, into a present value, was applied throughout the industrialization era and remains persuasive today. However, DCF valuation is prone to error for IT companies because they hold more intangible than tangible assets. ROV has recently been proposed as a method that evaluates a company's various options formally and quantitatively, but the evaluation of those options has so far been subjective and theoretical, and the lack of objective grounds and options makes it hard to apply in practice. In this thesis, comparing DCF and ROV across four cases, we find that ROV is more accurate than DCF. Because the options applied in ROV are overly limited, we tried to extend ROV by deriving five intangible value factors within IT companies. On this basis, basic valuation methods for IT companies should be established, and effective valuation methods suited to each type of IT company, such as internet-based companies, software developers, and network-related companies, should be researched and developed.
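
For reference, the sketch below illustrates the DCF calculation the paper contrasts with ROV: projected free cash flows and a Gordon-growth terminal value discounted to present value. The cash flows, discount rate, and growth rate are hypothetical placeholders, not figures from the four cases studied.

```python
# Minimal sketch of the DCF idea: discount projected free cash flows and a
# terminal value to present value. All numbers below are hypothetical.
def dcf_value(cash_flows, discount_rate, terminal_growth):
    """cash_flows: projected free cash flows for years 1..n."""
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(cash_flows, start=1))
    # Gordon-growth terminal value at the end of year n, discounted back to today
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv += terminal / (1 + discount_rate) ** len(cash_flows)
    return pv

if __name__ == "__main__":
    print(round(dcf_value([100, 110, 120], discount_rate=0.12, terminal_growth=0.02), 1))
```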

Automation Tool Design for PL/SQL Applications Conversion (PL/SQL 응용프로그램 전환을 위한 자동화 도구 설계)

  • Jee, Jungeun;Lee, Jeongkun;Choi, Yongrak;Shin, Yongtae
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.7 no.8
    • /
    • pp.287-296
    • /
    • 2018
  • In the recent commercial DBMS market, users' burden and complaints over expensive licensing policies and slow technical support are rising, and interest in open-source DBMSs, which pose no problems in compatibility or stability, is growing. As a result, more organizations are cutting costs by converting applications built on Oracle's DBMS, which holds roughly 60% of the DBMS market, to an open-source DBMS. However, because Oracle applications use the proprietary PL/SQL, converting their non-portable statements to an ANSI-standard-based open-source DBMS requires a great deal of manual work, resulting in significant losses of time and money. A tool that automatically converts PL/SQL to standard SQL is therefore required. The proposed automation tool converts PL/SQL into Java Stored Procedures, an ANSI-standard-based programming approach supported by open-source DBMSs. Tests of the automation tool verify the identity of the input and output data and its reliability after errors in the conversion to Java Stored Procedures are corrected, demonstrating that the tool can contribute to shortening conversion time and saving cost.
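
As a rough illustration of rule-based conversion, the toy sketch below maps a couple of Oracle-specific constructs to ANSI equivalents with pattern rules; it is far simpler than the proposed tool, and the rules shown are assumptions for illustration, not its actual mapping table.

```python
# Toy sketch of rule-based statement rewriting: substitute Oracle-specific
# functions with ANSI-standard equivalents. The two rules below are
# illustrative only; a real PL/SQL-to-Java-Stored-Procedure converter needs
# far broader syntactic and semantic handling.
import re

RULES = [
    (re.compile(r"\bNVL\s*\(", re.IGNORECASE), "COALESCE("),          # Oracle NVL -> ANSI COALESCE
    (re.compile(r"\bSYSDATE\b", re.IGNORECASE), "CURRENT_TIMESTAMP"),  # Oracle SYSDATE -> ANSI timestamp
]

def convert_statement(plsql: str) -> str:
    out = plsql
    for pattern, replacement in RULES:
        out = pattern.sub(replacement, out)
    return out

if __name__ == "__main__":
    print(convert_statement("SELECT NVL(name, 'N/A'), SYSDATE FROM emp"))
```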

A Mobile P2P Message Platform Enabling the Energy-Efficient Handover between Heterogeneous Networks (이종 네트워크 간 에너지 효율적인 핸드오버를 지원하는 모바일 P2P 메시지 플랫폼)

  • Kim, Tae-Yong;Kang, Kyung-Ran;Cho, Young-Jong
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.15 no.10
    • /
    • pp.724-739
    • /
    • 2009
  • This paper proposes an energy-efficient message delivery scheme and a software platform that exploit the multiple network interfaces and GPS of current mobile devices. Each mobile terminal chooses a delivery method among 'direct', 'indirect', and 'WAN' based on its own position information and that of the other terminals. The 'direct' method sends a message directly to the target terminal using a local radio access technology (RAT). The 'indirect' method extends the service area by using intermediate terminals as relay nodes. If the target terminal is too far to reach by the 'direct' or 'indirect' method, the message is sent over a wireless WAN technology. Because the proposed scheme exploits position information, power consumption in determining handover time and direction is drastically reduced. Network simulation results show that the proposed delivery scheme improves message transfer efficiency and handover detection latency. We implemented a message platform realizing the proposed scheme on a smartphone and compared it with other typical message platforms in terms of energy efficiency, both by measuring real power consumption and by applying a mathematical model. The comparison shows that our platform requires significantly less power.
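
A minimal sketch of the position-based delivery decision is shown below; the range thresholds and the single relay-availability flag are hypothetical simplifications of the platform's actual handover logic.

```python
# Minimal sketch of the direct/indirect/WAN decision described above, with
# hypothetical range thresholds; relay selection and handover state are
# simplified to a single flag.
import math

LOCAL_RAT_RANGE_M = 100.0        # hypothetical direct (e.g. WLAN/Bluetooth) range
RELAY_EXTENDED_RANGE_M = 200.0   # hypothetical reach via one relay hop

def distance_m(a, b):
    # a, b are (x, y) positions in meters; a flat approximation suffices at this scale.
    return math.hypot(a[0] - b[0], a[1] - b[1])

def choose_delivery(my_pos, target_pos, relay_available: bool) -> str:
    d = distance_m(my_pos, target_pos)
    if d <= LOCAL_RAT_RANGE_M:
        return "direct"
    if relay_available and d <= RELAY_EXTENDED_RANGE_M:
        return "indirect"
    return "WAN"

if __name__ == "__main__":
    print(choose_delivery((0, 0), (150, 0), relay_available=True))   # -> "indirect"
```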

A Policy-Based Meta-Planning for General Task Management for Multi-Domain Services (다중 도메인 서비스를 위한 정책 모델 주도 메타-플래닝 기반 범용적 작업관리)

  • Choi, Byunggi;Yu, Insik;Lee, Jaeho
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.8 no.12
    • /
    • pp.499-506
    • /
    • 2019
  • An intelligent robot should decide its behavior according to dynamic changes in the environment and the user's requirements by evaluating options and choosing the best one for the current situation. Many intelligent robot systems based on the Procedural Reasoning System (PRS) accomplish such task management by defining priority functions in the task model and evaluating the priority functions of the tasks applicable to the current situation. These priority functions, however, are defined locally inside each plan, which limits their use for multi-domain services because global contexts for overall prioritization are hard to express in local priority functions. Furthermore, since the prioritization functions are not defined as an explicit module, reusing or extending them for general contexts is difficult. To remove these limitations, we propose a policy-based meta-planning approach for general task management in multi-domain services, which makes it possible to define a task's utility explicitly in the meta-planning process and thus to evaluate task priorities for general contexts by combining modular priority functions. An ontological specification of the model also enhances the scalability of the policy model. In the experiments, the adaptive behavior of a robot following the policy model is confirmed by observing that appropriate tasks are selected in dynamic service environments.
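
The sketch below illustrates the idea of combining modular priority functions under an explicit policy to compute a task's utility; the priority functions, weights, and context fields are hypothetical and do not reflect the paper's ontological policy model.

```python
# Minimal sketch: modular priority functions score a task against the current
# context, and an explicit policy supplies their weights. All functions,
# weights, and context fields here are hypothetical illustrations.
from typing import Callable, Dict, List

PriorityFn = Callable[[str, Dict], float]

def urgency(task: str, ctx: Dict) -> float:
    return 1.0 if task in ctx.get("urgent_tasks", []) else 0.0

def user_preference(task: str, ctx: Dict) -> float:
    return ctx.get("preferences", {}).get(task, 0.0)

def select_task(tasks: List[str], ctx: Dict,
                policy: Dict[PriorityFn, float]) -> str:
    def utility(task: str) -> float:
        return sum(weight * fn(task, ctx) for fn, weight in policy.items())
    return max(tasks, key=utility)

if __name__ == "__main__":
    ctx = {"urgent_tasks": ["deliver_medicine"], "preferences": {"clean_room": 0.4}}
    policy = {urgency: 0.7, user_preference: 0.3}     # policy weights are explicit and reusable
    print(select_task(["clean_room", "deliver_medicine"], ctx, policy))
```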

Towards Carbon-Neutralization: Deep Learning-Based Server Management Method for Efficient Energy Operation in Data Centers (탄소중립을 향하여: 데이터 센터에서의 효율적인 에너지 운영을 위한 딥러닝 기반 서버 관리 방안)

  • Sang-Gyun Ma;Jaehyun Park;Yeong-Seok Seo
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.12 no.4
    • /
    • pp.149-158
    • /
    • 2023
  • As data utilization becomes more important, the importance of data centers is also increasing. However, a data center is problematic both environmentally and economically because it is a massive power-consuming facility that runs 24 hours a day. Recently, studies using deep learning techniques to reduce the power consumed by data centers or servers, or to predict their traffic, have been conducted from various perspectives. However, the amount of traffic processed by servers fluctuates irregularly, which makes the servers difficult to manage, and many more studies on dynamic server management techniques are still needed. Therefore, in this paper we propose a dynamic server management technique based on Long Short-Term Memory (LSTM), which is robust for time-series prediction. The proposed model allows servers to be managed more reliably and efficiently than before in field environments and reduces the power used by servers more effectively. To verify the proposed model, we collected transmission and reception traffic data from six of Wikipedia's data centers, analyzed the relationships among the traffic data with statistical analysis, and conducted experiments. The experimental results show that the proposed model helps run servers reliably and efficiently.
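
A minimal LSTM forecaster of the kind described is sketched below in PyTorch; the window length, layer sizes, and random input are placeholders rather than the configuration or Wikipedia traffic data used in the paper.

```python
# Minimal sketch of an LSTM traffic forecaster: predict the next traffic value
# from a window of past values. Layer sizes and the 24-step window are
# hypothetical placeholders.
import torch
import torch.nn as nn

class TrafficLSTM(nn.Module):
    def __init__(self, hidden_size: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):               # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]) # predict the next step from the last hidden state

if __name__ == "__main__":
    model = TrafficLSTM()
    window = torch.randn(8, 24, 1)      # 8 sequences of 24 past traffic samples (dummy data)
    print(model(window).shape)          # torch.Size([8, 1])
```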

HEVC Encoder Optimization using Depth Information (깊이정보를 이용한 HEVC의 인코더 고속화 방법)

  • Lee, Yoon Jin;Bae, Dong In;Park, Gwang Hoon
    • Journal of Broadcast Engineering
    • /
    • v.19 no.5
    • /
    • pp.640-655
    • /
    • 2014
  • Many of today's video systems include an additional depth camera to provide extra features such as 3D support. Thanks to these changes in multimedia systems, it is now much easier to obtain depth information for a video. Depth information can be used in various areas, such as object classification and background-area recognition, and with it we can achieve higher coding efficiency than with conventional methods alone. In this paper, we propose a 2D video coding algorithm that uses depth information on top of the next-generation 2D video codec HEVC. The background area can be recognized from the depth information, and using it in HEVC reduces coding complexity. If the current CU belongs to the background area, we apply the following three methods: 1) early termination of CU splitting when the PU SKIP mode is selected, 2) limiting the CU split structure using the CU information at the co-located temporal position, and 3) limiting the motion search range. We implemented the proposal in the HEVC HM 12.0 reference software. The results show that encoding complexity is reduced by more than 40% with only a 0.5% BD-bitrate loss. In particular, for video acquired with the Kinect developed by Microsoft Corp., encoding complexity is reduced by up to 53% without quality loss. These techniques are therefore expected to be applicable to real-time online communication, mobile or handheld video services, and so on.
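
The control-flow sketch below illustrates the depth-assisted early-termination idea (methods 1 and 3); it is not the HM 12.0 implementation, and the depth threshold and search-range values are hypothetical.

```python
# Control-flow sketch (not the HM 12.0 implementation) of the depth-assisted
# encoder shortcuts: if the depth map marks a CU as background and its best PU
# mode is SKIP, stop splitting and shrink the motion search range. Threshold
# and range values are hypothetical.
BACKGROUND_DEPTH_THRESHOLD = 200   # hypothetical: larger depth value = farther from camera
DEFAULT_SEARCH_RANGE = 64
REDUCED_SEARCH_RANGE = 8

def is_background_cu(depth_block) -> bool:
    # Treat the CU as background if its mean depth is beyond the threshold.
    return sum(depth_block) / len(depth_block) >= BACKGROUND_DEPTH_THRESHOLD

def encode_cu(depth_block, best_pu_mode: str):
    background = is_background_cu(depth_block)
    split_further = not (background and best_pu_mode == "SKIP")                  # method 1
    search_range = REDUCED_SEARCH_RANGE if background else DEFAULT_SEARCH_RANGE  # method 3
    return {"split_further": split_further, "search_range": search_range}

if __name__ == "__main__":
    print(encode_cu([210] * 64, best_pu_mode="SKIP"))   # background CU: no further split
```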