• Title/Summary/Keyword: 브라우저 (browser)

Search Results: 1,016

Design of an Integrated University Information Service Model Based on Block Chain (블록체인 기반의 대학 통합 정보서비스 실증 모델 설계)

  • Moon, Sang Guk;Kim, Min Sun;Kim, Hyun Joo
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.20 no.2
    • /
    • pp.43-50
    • /
    • 2019
  • Blockchain offers technical advantages such as robust security, owing to a structure in which records cannot be forged, decentralization through a ledger shared among participants, and hyper-connectivity linking the Internet of Things, robots, and artificial intelligence. As a result, public organizations are highly receptive to adopting blockchain-based technology, and the design of university information services is no exception. Universities are likewise considering applying blockchain technology to the foundations on which their various information services are implemented. Through case studies of blockchain applications across various industries, this study designs an empirical model of an integrated information service platform that unifies the information systems within a university. A basic roadmap for university information services is constructed on blockchain technology, from planning through the actual service design stage, and this framework is then applied to design an actual empirical model of a blockchain-based integrated university information service.

A Study on User Behavior of University Library Website based Big Data: Focusing on the Library of C University (빅데이터 기반 대학도서관 웹사이트 이용행태에 관한 연구: C대학교 도서관을 중심으로)

  • Lee, Sun Woo;Chang, Woo Kwon
    • Journal of the Korean Society for information Management
    • /
    • v.36 no.3
    • /
    • pp.149-174
    • /
    • 2019
  • This study analyzes actual usage data from a university library website, examines users' behavior, and proposes improvements to the website. User traffic on the C University library website was analyzed from January 2018 to December 2018, using 'Google Analytics' as the analysis tool. Web traffic was analyzed across five categories: general user characteristics, user environment, visits, inflow, and site content, based on the metrics of sessions, users, page views, pages per session, average session duration, and bounce rate. The results were as follows. 1) In the analysis of general user characteristics, the website was accessed not only from Korea but also from China. 2) In the user environment analysis, Internet Explorer was the most common browser, followed by Chrome; Safari, in third and fourth place, showed a bounce rate roughly double that of Explorer or Chrome. In terms of screen resolution, 1920x1080 accounted for the largest share, with access from a variety of other environments. 3) Direct inflow was the highest in the inflow analysis. 4) In the site analysis, out of 4,534,084 total page views, the main page received the most views, followed by the lending/extension/history/booking page, the academic DB page, and the collection page.
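The traffic metrics listed above (sessions, users, page views, pages per session, average session duration, bounce rate) can also be reproduced from raw visit logs. The following is a minimal sketch assuming a hypothetical session-level CSV export, not the study's actual Google Analytics data or schema:

```python
# Minimal sketch: computing GA-style metrics from a hypothetical session-level log.
# Column names (session_id, user_id, pageviews, duration_sec) are illustrative
# assumptions, not the schema used in the study.
import pandas as pd

sessions = pd.read_csv("sessions_2018.csv")  # hypothetical export

metrics = {
    "sessions": len(sessions),
    "users": sessions["user_id"].nunique(),
    "page_views": sessions["pageviews"].sum(),
    "pages_per_session": sessions["pageviews"].mean(),
    "avg_session_duration_sec": sessions["duration_sec"].mean(),
    # A bounce is conventionally a single-page session.
    "bounce_rate": (sessions["pageviews"] == 1).mean(),
}
print(metrics)
```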

Zkimi : Integrated Security Analysis Tool for UNIX Systems (지킴이: 유닉스 시스템을 위한 통합 보안 점검 도구)

  • 채흥석;이남희;김형호;김내희;차성덕;백석철;임규건;박승민;정종윤
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.7 no.3
    • /
    • pp.23-40
    • /
    • 1997
  • There are many security tools for investigating and improving UNIX systems. However, most of them fail to provide a consistent and usable user interface, and they concentrate on a specific aspect of a system rather than the whole. For overall management, system administrators are forced to use several inconvenient tools. This paper introduces an integrated security analysis tool, named "Zkimi", which provides a convenient user interface and investigates various aspects of UNIX systems such as account security, system security, network security, and file system integrity. Zkimi supports a user-friendly WWW-based interface, so administrators can examine the various aspects of a system more easily. We tried the tool on a system of moderate size and confirmed that it is very efficient for investigating the various security aspects of a system.
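As an illustration of the kind of file-system check such an integrated audit tool performs, the following is a minimal sketch (not Zkimi's actual implementation) that flags world-writable regular files under a directory tree:

```python
# Minimal sketch of one file-system check a UNIX security audit tool might run:
# flag world-writable regular files. Illustrative example only, not Zkimi's code.
import os
import stat

def world_writable_files(root="/etc"):
    findings = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                mode = os.lstat(path).st_mode
            except OSError:
                continue  # unreadable or vanished file
            if stat.S_ISREG(mode) and (mode & stat.S_IWOTH):
                findings.append(path)
    return findings

if __name__ == "__main__":
    for path in world_writable_files("/etc"):
        print("world-writable:", path)
```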

A Study on Court Auction System using Ethereum-based Ether (이더리움 기반의 이더를 사용한 법원 경매 시스템에 관한 연구)

  • Kim, Hyo-Jong;Han, Kun-Hee;Shin, Seung-Soo
    • Journal of Convergence for Information Technology
    • /
    • v.11 no.2
    • /
    • pp.31-40
    • /
    • 2021
  • Blockchain technology is being actively studied in the real estate transaction field, where transactions take various forms. In this paper, we propose a model that simplifies the authentication procedure of auction systems by using Ethereum's Ether, in order to solve the problems of offline court auctions. The proposed model is written in Ethereum's Solidity language: the court registers the sale item and its sale date through a DApp browser, and a bidder accesses the system with the address of a personal wallet created through a MetaMask private key. The bidder then selects the desired sale item and enters a bid amount to participate in the auction. The bidder's record of the highest bid for the desired sale item is written to the Ethereum test network as a smart contract, which creates a block. Finally, the smart contracts written to the network are distributed by the court auction manager to all nodes in the blockchain network, and every node can view and verify the contracts. Analysis of the proposed model's smart contracts and system performance shows that fees are incurred in creating and using Ether on Ethereum-based platforms and in participating in the auction, and that changes in the value of Ether affect the sale price, so the fees recorded in the smart contracts are inconsistent from one transaction to the next. In future work, we will issue our own token to address the market volatility and fee problems caused by changes in the value of Ether, and refine the complex court auction system.
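The highest-bid bookkeeping described above can be illustrated off-chain. The following is a minimal Python sketch of that logic, assuming simplified names (Sale, place_bid); it is an analogue of what the paper's Solidity contract records, not the authors' contract itself:

```python
# Minimal off-chain sketch of the highest-bid logic a court-auction smart contract
# might implement. Names (Sale, place_bid) are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Sale:
    sale_id: int
    sale_date: str
    highest_bid_wei: int = 0
    highest_bidder: str = ""          # bidder's wallet address
    bids: list = field(default_factory=list)

    def place_bid(self, bidder_address: str, amount_wei: int) -> bool:
        """Record a bid; only a bid above the current highest is accepted."""
        self.bids.append((bidder_address, amount_wei))
        if amount_wei > self.highest_bid_wei:
            self.highest_bid_wei = amount_wei
            self.highest_bidder = bidder_address
            return True
        return False

# Usage: the court registers a sale, bidders submit bids from their wallet addresses.
sale = Sale(sale_id=2021001, sale_date="2021-02-15")
sale.place_bid("0xBidderA...", 150_000_000)
sale.place_bid("0xBidderB...", 170_000_000)
print(sale.highest_bidder, sale.highest_bid_wei)
```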

Web viewer for sharing of prosthesis design between laboratory and clinic: Case report (웹뷰어를 이용한 기공실과 진료실 측 간의 보철물 설계 형태의 공유: 증례 보고)

  • Jang, Sung Won;Lee, Ho Jin;Kim, So-Yeun;Lee, Du-Hyeong
    • The Journal of Korean Academy of Prosthodontics
    • /
    • v.60 no.3
    • /
    • pp.276-282
    • /
    • 2022
  • Close communication between clinicians and dental technicians is an important factor in providing successful prostheses. The exchange of opinions with laboratories has mainly taken the form of written prescriptions and photos, but it has been reported that such information transmission may be limited. Now that digital technology-based prosthesis fabrication is common, 3D image objects can be stored on the web and easily viewed through a mobile web browser. In this article, we present cases in which the prosthesis design was improved by designing the prosthesis with CAD software and reviewing the design with the clinical side through a web viewer. Through this protocol, it was possible to improve the occlusal surface and crown contour, the relationship with the opposing teeth, the size of the gingival embrasure, and the shape of the pontic. The process of sharing, discussing, and modifying the prosthesis design between clinician and technician through a web viewer helps reflect the diversity of oral conditions and individualized needs, and thus contributes to producing functional and esthetic prostheses.

A Study on the remote acquisition of HejHome Air Cloud artifacts (스마트 홈 헤이 홈 Air의 클라우드 아티팩트 원격 수집 방안 연구)

  • Kim, Ju-eun;Seo, Seung-hee;Cha, Hae-seong;Kim, Yeok;Lee, Chang-hoon
    • Journal of Internet Computing and Services
    • /
    • v.23 no.5
    • /
    • pp.69-78
    • /
    • 2022
  • As the use of Internet of Things (IoT) devices has expanded, the digital forensics coverage of the National Police Agency has been extended to smart home areas. However, most existing studies on acquiring smart home platform data have focused on analyzing local data on mobile devices or on network-level analysis, whereas the data meaningful for evidence analysis is mainly stored in cloud storage on smart home platforms. Therefore, in this paper, we study how to acquire data stored in the cloud in a HejHome Air environment by extracting the accessToken of the user account from the cookie databases of browsers such as Microsoft Edge, Google Chrome, Mozilla Firefox, and Opera, which are recorded on a PC when the user uses the HejHome app-based "HejHome Square" service. The test environment was configured with smart temperature/humidity sensors, smart door sensors, and smart motion sensors, and artifacts such as temperature and humidity data by date and place, the list of devices used, and motion detection records were collected. Information such as the temperature and humidity at the time of an incident can be obtained from the artifact analysis results and used in the forensic investigation process. In addition, the cloud data acquisition method using OpenAPI proposed in this paper excludes the possibility of tampering during the data collection process, and because it is API-based, it follows the principles of integrity and reproducibility, which are fundamental principles of digital forensics.
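As an illustration of the browser-side step described above, the following is a minimal sketch that lists cookies for a given host from a Chromium-style SQLite cookie database. The database path and the host fragment are illustrative assumptions, and real Chromium builds keep the value encrypted in a separate encrypted_value column, so this is only a sketch of where such a token would be looked for, not the paper's acquisition procedure:

```python
# Minimal sketch: enumerating cookies for a host from a Chromium-style SQLite cookie DB.
# The path and host fragment below are illustrative assumptions; real Chromium builds
# store the cookie value encrypted in the 'encrypted_value' column.
import sqlite3

COOKIE_DB = r"C:\Users\user\AppData\Local\Google\Chrome\User Data\Default\Network\Cookies"  # assumed path

def cookies_for_host(db_path: str, host_fragment: str):
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT host_key, name, value FROM cookies WHERE host_key LIKE ?",
            (f"%{host_fragment}%",),
        ).fetchall()
    finally:
        conn.close()
    return rows

for host, name, value in cookies_for_host(COOKIE_DB, "hejhome"):  # host fragment is an assumption
    print(host, name, value or "<encrypted>")
```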

Oil Spill Monitoring in Norilsk, Russia Using Google Earth Engine and Sentinel-2 Data (Google Earth Engine과 Sentinel-2 위성자료를 이용한 러시아 노릴스크 지역의 기름 유출 모니터링)

  • Minju Kim;Chang-Uk Hyun
    • Korean Journal of Remote Sensing
    • /
    • v.39 no.3
    • /
    • pp.311-323
    • /
    • 2023
  • Oil spill accidents can cause various environmental issues, so it is important to quickly assess the extent of the spilled oil and changes in its area and location. In oil spill detection using satellite imagery, wide oil spill areas can be detected by utilizing information collected from the various sensors on board the satellite. Previous studies have analyzed the reflectance of oil at specific wavelengths and developed oil spill indices using bands within those wavelength ranges. When analyzing multiple images before and after an oil spill for monitoring purposes, a significant amount of time and computing resources is consumed due to the large volume of data. Google Earth Engine, which allows large volumes of satellite imagery to be analyzed through a web browser, makes it possible to detect oil spills efficiently. In this study, we evaluated the applicability of four oil spill indices in an area with various land cover types using Sentinel-2 MultiSpectral Instrument data and the cloud-based Google Earth Engine platform, and assessed the separability of oil spill areas by comparing the index values across different land covers. The results demonstrate that Google Earth Engine can be used efficiently in oil spill detection research and indicate that oil spill index B ((B3+B4)/B2) and oil spill index C (R: B3/B2, G: (B3+B4)/B2, B: (B6+B7)/B5) can contribute to effective oil spill monitoring in other regions with complex land covers.
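As a sketch of how such an index can be computed on the platform, the following uses the Google Earth Engine Python API to build oil spill index B, (B3+B4)/B2, from a Sentinel-2 composite. The area of interest, date range, and cloud threshold are placeholder assumptions, not the study's configuration:

```python
# Minimal sketch: computing oil spill index B = (B3 + B4) / B2 from Sentinel-2 data
# with the Google Earth Engine Python API. AOI, date range, and cloud threshold are
# illustrative assumptions, not the values used in the study.
import ee

ee.Initialize()  # assumes Earth Engine authentication has already been completed

aoi = ee.Geometry.Rectangle([87.9, 69.2, 88.4, 69.5])  # rough placeholder near Norilsk

composite = (
    ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
    .filterBounds(aoi)
    .filterDate("2020-06-01", "2020-07-01")
    .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))
    .median()
    .clip(aoi)
)

oil_index_b = composite.expression(
    "(B3 + B4) / B2",
    {
        "B2": composite.select("B2"),
        "B3": composite.select("B3"),
        "B4": composite.select("B4"),
    },
).rename("oil_spill_index_B")

print(oil_index_b.bandNames().getInfo())
```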

Study on the Analysis of National Paralympics by Utilizing Social Big Data Text Mining (소셜 빅데이터 텍스트 마이닝을 활용한 전국장애인체육대회 분석 연구)

  • Kim, Dae kyung;Lee, Hyun Su
    • 한국체육학회지인문사회과학편
    • /
    • v.55 no.6
    • /
    • pp.801-810
    • /
    • 2016
  • The purpose of this study was to conduct text mining on keywords related to the National Paralympics and to provide basic information that could be used to change the perception of people without disabilities toward disability and to promote the participation of people with and without disabilities in the National Paralympics. Social big data regarding the National Paralympics were retrieved from news articles and blog postings identified through the search engines Naver, Daum, and Google. The data were then analyzed using R version 3.3.1. The analysis techniques were word cloud analysis, correlation analysis, and social network analysis. The results were as follows. First, news articles were mainly related to game results, sports events, team participation, and the host venue of the 33rd-36th National Paralympics. Second, search results about the 33rd-36th National Paralympics from Naver, Daum, and Google were similar to one another. Third, the keywords National Paralympics, sports for the disabled, and sports showed high closeness centrality. Further, high degree centrality and betweenness centrality were observed for keywords and keyword pairs such as sports for all, participation, research, development, sports-disabled, research-disabled, sports for all-participation, disabled-participation, sports for all-disabled, and host-paralympics.
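The centrality measures reported above can be computed from a keyword co-occurrence network. The following is a minimal Python sketch of that step (the study itself used R 3.3.1), with a toy co-occurrence list standing in for the actual news and blog data:

```python
# Minimal sketch: centrality measures on a keyword co-occurrence network, as used in
# social network analysis of text. The edge list is a toy example, not the study's data.
import networkx as nx

# (keyword A, keyword B, co-occurrence count) -- illustrative values only
cooccurrences = [
    ("National Paralympics", "sports for the disabled", 42),
    ("National Paralympics", "participation", 30),
    ("sports for all", "participation", 18),
    ("sports for the disabled", "sports", 25),
    ("host", "National Paralympics", 12),
]

G = nx.Graph()
for a, b, weight in cooccurrences:
    G.add_edge(a, b, weight=weight)

print("degree centrality:", nx.degree_centrality(G))
print("closeness centrality:", nx.closeness_centrality(G))
print("betweenness centrality:", nx.betweenness_centrality(G, weight="weight"))
```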

Propose a Static Web Standard Check Model

  • Hee-Yeon Won;Jae-Woong Kim;Young-Suk Chung
    • Journal of the Korea Society of Computer and Information
    • /
    • v.29 no.4
    • /
    • pp.83-89
    • /
    • 2024
  • After Internet Explorer was retired, the use of ActiveX ended and the Non-ActiveX policy spread. HTML5 is used as the standard for web pages built under the Non-ActiveX policy. HTML5, developed by the W3C (World Wide Web Consortium), provides a better web application experience through APIs, with various elements and attributes added to the browser without plug-ins. However, new security vulnerabilities have been discovered in the newly added technologies, and these vulnerabilities have widened the scope of attacks. Research on finding possible security vulnerabilities in websites that have adopted HTML5 is still lacking. This paper proposes a model that detects tags and attributes with web vulnerabilities by inspecting and analyzing the security vulnerabilities of web pages of public institutions from which plug-ins have been removed within the last five years. When applied to a web page, the proposed model can analyze the page's current standards compliance and vulnerabilities even after the plug-ins have been removed, helping to provide reliable web services. It is also expected to help prevent the financial and physical damage caused by hacking.
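As an illustration of the kind of static check such a model performs, the following is a minimal sketch that parses a page and flags tags or attributes often cited as risky in HTML5 audits; the flag lists are illustrative assumptions, not the paper's rule set:

```python
# Minimal sketch: statically scanning an HTML page for tags/attributes commonly flagged
# in HTML5 security audits. SUSPECT_TAGS / SUSPECT_ATTRS are illustrative assumptions,
# not the rule set proposed in the paper.
from bs4 import BeautifulSoup

SUSPECT_TAGS = {"embed", "object", "iframe"}
SUSPECT_ATTRS = {"onerror", "onload", "onfocus", "autofocus", "formaction", "srcdoc"}

def scan_html(html: str):
    findings = []
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all(True):
        if tag.name in SUSPECT_TAGS:
            findings.append((tag.name, None))
        for attr in tag.attrs:
            if attr.lower() in SUSPECT_ATTRS or attr.lower().startswith("on"):
                findings.append((tag.name, attr))
    return findings

sample = '<body><img src="x" onerror="alert(1)"><iframe srcdoc="<b>hi</b>"></iframe></body>'
for tag_name, attr in scan_html(sample):
    print(f"tag <{tag_name}> attribute {attr!r}")
```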

A Proposal of a Keyword Extraction System for Detecting Social Issues (사회문제 해결형 기술수요 발굴을 위한 키워드 추출 시스템 제안)

  • Jeong, Dami;Kim, Jaeseok;Kim, Gi-Nam;Heo, Jong-Uk;On, Byung-Won;Kang, Mijung
    • Journal of Intelligence and Information Systems
    • /
    • v.19 no.3
    • /
    • pp.1-23
    • /
    • 2013
  • To discover significant social issues such as unemployment, economic crisis, and social welfare that urgently need to be solved in modern society, the existing approach has researchers collect opinions from professional experts and scholars through online or offline surveys. However, such a method is not always effective. Due to the expense involved, a large number of survey replies is seldom gathered, and in some cases it is hard to find experts dealing with specific social issues, so the sample set is often small and may be biased. Furthermore, regarding a given social issue, several experts may reach totally different conclusions because each expert has a subjective point of view and a different background; in that case it is considerably hard to figure out what the current social issues are and which of them are really important. To overcome the shortcomings of the current approach, in this paper we develop a prototype system that semi-automatically detects social issue keywords representing social issues and problems from about 1.3 million news articles issued by about 10 major domestic presses in Korea from June 2009 until July 2012. Our proposed system consists of (1) collecting and extracting texts from the collected news articles, (2) identifying only the news articles related to social issues, (3) analyzing the lexical items of the Korean sentences, (4) finding a set of topics regarding social keywords over time based on probabilistic topic modeling, (5) matching relevant paragraphs to a given topic, and (6) visualizing social keywords for easy understanding. In particular, we propose a novel matching algorithm relying on generative models, whose goal is to best match paragraphs to each topic. Technically, using a topic model such as Latent Dirichlet Allocation (LDA), we can obtain a set of topics, each of which has relevant terms and their probability values. In our problem, given a set of text documents (e.g., news articles), LDA yields a set of topic clusters, and each topic cluster is then labeled by human annotators, where each topic label stands for a social keyword. For example, suppose there is a topic (e.g., Topic1 = {(unemployment, 0.4), (layoff, 0.3), (business, 0.3)}) and a human annotator labels Topic1 "Unemployment Problem". In this example, it is non-trivial to understand what actually happened with the unemployment problem in our society; looking only at the social keywords, we have no idea of the detailed events occurring in society. To tackle this, we develop a matching algorithm that computes the probability of a paragraph given a topic, relying on (i) the topic terms and (ii) their probability values. Given a set of text documents, we segment each document into paragraphs while extracting a set of topics from the documents using LDA. Based on our matching process, each paragraph is assigned to the topic it best matches, so that each topic ends up with several best-matched paragraphs. Suppose, for instance, there is a topic (e.g., Unemployment Problem) and its best-matched paragraph (e.g., "Up to 300 workers lost their jobs in XXX company in Seoul"); in this case, we can grasp detailed information about the social keyword, such as "300 workers", "unemployment", "XXX company", and "Seoul". In addition, our system visualizes social keywords over time. Therefore, through our matching process and keyword visualization, researchers will be able to detect social issues easily and quickly. With this prototype system, we have detected various social issues appearing in our society and have also shown the effectiveness of our proposed methods in our experimental results. Note that our proof-of-concept system is also available at http://dslab.snu.ac.kr/demo.html.
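A rough shape of steps (4) and (5), topic extraction followed by assigning each paragraph to its best-matching topic, can be sketched as below. This uses scikit-learn's LDA and a simple argmax assignment as a stand-in, not the authors' generative matching algorithm, and the corpus is a toy example:

```python
# Minimal sketch of LDA topic extraction plus paragraph-to-topic matching, as a stand-in
# for the paper's pipeline. The corpus is a toy example, and assigning each paragraph to
# the argmax of its topic distribution is a simplification of the authors' matching
# algorithm.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

paragraphs = [
    "Up to 300 workers lost their jobs at a company in Seoul amid rising unemployment.",
    "The government announced a new welfare program for low-income households.",
    "Layoffs continued as the business downturn deepened across the manufacturing sector.",
    "Social welfare spending and pension reform dominated the policy debate this week.",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(paragraphs)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(X)          # per-paragraph topic distribution

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {k}: {top_terms}")

# Assign each paragraph to its best-matching topic.
for text, dist in zip(paragraphs, doc_topic):
    print(dist.argmax(), text)
```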