• Title/Summary/Keyword: URL Information

Search Result 329

SHRT : New Method of URL Shortening including Relative Word of Target URL (SHRT : 유사 단어를 활용한 URL 단축 기법)

  • Yoon, Soojin;Park, Jeongeun;Choi, Changkuk;Kim, Seungjoo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.38B no.6
    • /
    • pp.473-484
    • /
    • 2013
  • A URL shortening service substitutes a short URL for a long one and redirects requests for the short URL to its target. As microblog use grew rapidly, shortened URLs became common under microblogs' message-length limits, since creating and using them is convenient; e-mail, SMS, and print also use them for their simplicity. However, most shortened URLs bear no relation to their target URLs, so users cannot anticipate where a link leads. Previous attempts to address this problem include encoding the service name in the short URL, embedding website information in it, and using short codes for physical addresses, but each has limits: difficulty of automation, relatively long addresses, or a narrow range of applicable targets. SHRT complements these attempts with an idea borrowed from the Arabic writing system: although Arabic script has no vowel letters, Arabic readers have no difficulty understanding it. This paper proposes SHRT, a new URL shortening method that lets users guess the target URL from a relative word of the target's lowest-level domain, written without vowels.
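The vowel-dropping idea can be sketched in a few lines of Python (an illustration of the concept only; the function name and the example token are not from the paper):

```python
def shrt_token(word: str) -> str:
    """Drop the vowels of a domain word, keeping the first character,
    so the token stays guessable -- the idea SHRT borrows from Arabic script."""
    first, rest = word[0], word[1:]
    return first + "".join(c for c in rest if c.lower() not in "aeiou")
```

A short URL for a site whose lowest-level domain is `facebook` could then carry a recognizable token like `fcbk` instead of an opaque hash.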

Fast URL Lookup Using URL Prefix Hash Tree (URL Prefix 해시 트리를 이용한 URL 목록 검색 속도 향상)

  • Park, Chang-Wook;Hwang, Sun-Young
    • Journal of KIISE:Information Networking
    • /
    • v.35 no.1
    • /
    • pp.67-75
    • /
    • 2008
  • In this paper, we propose an efficient URL lookup algorithm for URL-list-based web content filtering systems. The proposed algorithm converts a URL list into URL-prefix form, builds a hash-tree representation of the prefixes, and performs tree searches for URL lookups, eliminating the redundant searches of the hash-table method. Experimental results show that the proposed algorithm is 62%~210% faster than the conventional hash-table method, depending on the number of segments.
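The prefix-tree lookup can be sketched as follows (a minimal Python illustration of the idea; the paper's structure hashes each segment, which plain `dict` keys approximate here):

```python
class PrefixHashTree:
    """Tree over URL prefixes: each node maps one URL segment to a child node."""
    def __init__(self):
        self.root = {}

    def insert(self, url_prefix: str) -> None:
        """Add one entry of the filtering list, segment by segment."""
        node = self.root
        for seg in url_prefix.strip("/").split("/"):
            node = node.setdefault(seg, {})
        node["#end"] = True  # marks the end of a listed prefix

    def matches(self, url: str) -> bool:
        """True if any listed prefix is a prefix of the given URL.
        One walk down the tree replaces repeated hash-table probes."""
        node = self.root
        for seg in url.strip("/").split("/"):
            if "#end" in node:
                return True
            if seg not in node:
                return False
            node = node[seg]
        return "#end" in node
```

A filtering system would call `matches` once per requested URL instead of hashing every possible prefix separately.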

Evaluating Site-based URL Normalization (사이트 기반의 URL 정규화 평가)

  • Jeong, Hyo-Sook;Kim, Sung-Jin;Lee, Sang-Ho
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2005.11b
    • /
    • pp.28-30
    • /
    • 2005
  • URL normalization is the process of transforming variously expressed but equivalent URLs into a single canonical form; duplicate URL representations of the same document are eliminated through it. Standard normalization was designed to produce no false positives (transforming non-identical URLs into the same string). However, because standard normalization leaves many false negatives, extended normalizations that permit some false positives while sharply reducing false negatives have been proposed and studied. This paper shows that applying an extended normalization to URLs within the same site produces consistent results, so the result of an extended normalization for one URL in a site can be used effectively to normalize the other URLs in that site. To this end, we propose a measure of the within-site consistency of extended normalization results and a site-based evaluation measure for extended normalization. Six extended normalizations are evaluated on 250,000 URLs extracted from 20,000 real Korean web sites.
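Two typical extended-normalization rules of the kind evaluated here can be sketched in Python (illustrative only; the six rules actually evaluated are defined in the paper):

```python
def extended_normalize(url: str) -> str:
    """Two example extended rules: drop a default index file, then drop a
    trailing slash on a non-root path. Unlike the standard rules, these can
    produce false positives when the shorter URL serves a different page."""
    for index in ("index.html", "index.htm", "default.asp"):
        if url.endswith("/" + index):
            url = url[: -len(index)]
    if url.endswith("/") and url.count("/") > 3:  # keep the root slash
        url = url[:-1]
    return url
```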


A Study on analysis tools in the SWF file URL (SWF 파일의 URL정보 분석도구)

  • Jang, Dong-Hwan;Song, Yu-Jin;Lee, Jae-Yong
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.15 no.5
    • /
    • pp.105-111
    • /
    • 2010
  • SWF (Shockwave Flash) is a vector-graphics file format produced by Adobe. It is widely used for content such as website advertising, widgets, games, education, and video, and contains various types of data such as sound, scripts, APIs, and images. Many SWF files carry URL information in their ActionScript code for network communication, and in forensic investigation this information can be as valuable as a PC user's web-browser history. Decompilers exist that can analyze SWF files and reveal their URL information, but verifying the URLs in the ActionScript of many SWF files with a decompiler takes a long time. This paper studies the analysis of URL information in ActionScript and its extraction from multiple SWF files by designing an analysis tool for URL information in SWF files.
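The extraction step can be sketched in Python (an assumption-level illustration, not the paper's tool; a real analyzer would parse the SWF tag structure rather than scan for URL-shaped strings):

```python
import re
import zlib

def swf_body(data: bytes) -> bytes:
    """Return the tag body; files whose header starts with b'CWS' are
    zlib-compressed after the first 8 bytes."""
    if data[:3] == b"CWS":
        return data[:8] + zlib.decompress(data[8:])
    return data

# printable-ASCII run starting with a scheme
URL_RE = re.compile(rb"https?://[\x21-\x7e]+")

def extract_urls(data: bytes) -> list:
    """Collect distinct URL-shaped strings from one SWF file's bytes."""
    found = {m.group().decode("ascii", "replace")
             for m in URL_RE.finditer(swf_body(data))}
    return sorted(found)
```

Running `extract_urls` over a directory of SWF files would batch the verification that the abstract notes is slow to do one file at a time in a decompiler.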

OTACUS: Parameter-Tampering Prevention Techniques using Clean URL (OTACUS: 간편URL기법을 이용한 파라미터변조 공격 방지기법)

  • Kim, Guiseok;Kim, Seungjoo
    • Journal of Internet Computing and Services
    • /
    • v.15 no.6
    • /
    • pp.55-64
    • /
    • 2014
  • In a web application, the URL parameters that form an important part of client-server communication pass unrestricted through specialized network security devices such as IPSs and firewalls and are forwarded to the web server. By tampering with the parameters of a requested URL, an attacker can disclose confidential information or gain financially through e-commerce. Because a parameter-tampering vulnerability can only be judged within the application's own logic, it cannot be blocked by a web application firewall. This paper presents OTACUS (One-Time Access Control URL System), a technique that complements the shortcomings of existing approaches. OTACUS simplifies complex parameter-carrying URLs with a clean-URL technique, preventing the URL's exposure to attackers and thereby effectively blocking tampering with POST or GET parameters passed to the server. Performance tests of an actual implementation show that OTACUS operates stably with a load increase of less than 3%.
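The clean-URL idea can be sketched as a one-time token table (hypothetical names and layout, not the OTACUS implementation):

```python
import secrets

class OneTimeURLMap:
    """Replace a parameterized URL with an opaque one-time token so the
    real parameters are never exposed to, or replayable by, the client."""
    def __init__(self):
        self._table = {}

    def issue(self, real_url: str) -> str:
        """Store the real URL server-side and hand out a clean path."""
        token = secrets.token_urlsafe(12)
        self._table[token] = real_url
        return "/go/" + token

    def resolve(self, clean_path: str):
        """Return the real URL for a clean path, or None. The entry is
        removed on use, so a second (tampered or replayed) request fails."""
        token = clean_path.rsplit("/", 1)[-1]
        return self._table.pop(token, None)
```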

Development of Rule-Based Malicious URL Detection Library Considering User Experiences (사용자 경험을 고려한 규칙기반 악성 URL 탐지 라이브러리 개발)

  • Kim, Bo-Min;Han, Ye-Won;Kim, Ga-Young;Kim, Ye-Bun;Kim, Hyung-Jong
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.30 no.3
    • /
    • pp.481-491
    • /
    • 2020
  • Malicious URLs, which can be used to deliver malicious code and illegally acquire private information, are one of the biggest threats in the information security field. The recent prevalence of smartphones in particular increases users' exposure to malicious URLs, and as the techniques for hiding a URL from the user grow more sophisticated, detection becomes harder. In this paper, after surveying user experiences related to malicious URLs, we propose a rule-based malicious URL detection method. In addition, we have developed a Java library that can be applied to any application that needs to handle malicious URLs. Each class in the library implements one rule for detecting a characteristic of a malicious URL, and the library itself is a set of rules that can be chained to detect more complicated situations and enhance accuracy. This rule-based approach improves extensibility in the face of the diversity of malicious URLs.
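The paper's library is written in Java; its one-rule-per-class, chain-of-rules design can be sketched in Python as (rule names here are illustrative, not the library's classes):

```python
from urllib.parse import urlparse

class Rule:
    """One detection rule for one characteristic of a malicious URL."""
    def check(self, url: str) -> bool:  # True means "suspicious"
        raise NotImplementedError

class HasIPHost(Rule):
    """Flag URLs whose host is a bare numeric IP instead of a domain name."""
    def check(self, url: str) -> bool:
        host = urlparse(url).hostname or ""
        return host != "" and host.replace(".", "").isdigit()

class TooManySubdomains(Rule):
    """Flag hosts with an unusually deep subdomain chain."""
    def __init__(self, limit: int = 4):
        self.limit = limit
    def check(self, url: str) -> bool:
        host = urlparse(url).hostname or ""
        return host.count(".") >= self.limit

class RuleChain(Rule):
    """A rule that is itself a chain of rules, so compound situations
    can be detected and new rules added without touching callers."""
    def __init__(self, rules):
        self.rules = rules
    def check(self, url: str) -> bool:
        return any(r.check(url) for r in self.rules)
```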

URL Normalization for Web Applications (웹 어플리케이션을 위한 URL 정규화)

  • Hong, Seok-Hoo;Kim, Sung-Jin;Lee, Sang-Ho
    • Journal of KIISE:Information Networking
    • /
    • v.32 no.6
    • /
    • pp.716-722
    • /
    • 2005
  • On the web, syntactically different URLs can represent the same resource. URL normalization is the process of transforming such URLs into a canonical form, and there are ongoing efforts to define a standard for it; standard URL normalization is designed to minimize false negatives while strictly avoiding false positives. This paper considers four URL normalization issues beyond those specified in the standard. The idea behind our work is to reduce false negatives further while allowing false positives at a limited level. Two metrics are defined to analyze the effect of each step in the normalization. We conducted an experiment over 170 million URLs collected from real web pages, and interesting statistical results are reported in this paper.
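The standard syntactic steps that this work extends can be sketched with Python's `urllib.parse` (a common reading of the standard rules, not the paper's code):

```python
from urllib.parse import urlsplit, urlunsplit, unquote, quote

def normalize(url: str) -> str:
    """Standard syntactic normalization: lowercase the scheme and host,
    drop the default port, normalize percent-escapes of unreserved
    characters, and ensure a non-empty path."""
    parts = urlsplit(url)
    scheme = parts.scheme.lower()
    host = (parts.hostname or "").lower()
    port = parts.port
    if (scheme, port) in (("http", 80), ("https", 443)):
        port = None  # default port carries no information
    netloc = host + (":%d" % port if port else "")
    # decode, then re-encode: unreserved chars stay literal, others stay escaped
    path = quote(unquote(parts.path), safe="/~!$&'()*+,;=:@-._") or "/"
    return urlunsplit((scheme, netloc, path, parts.query, parts.fragment))
```

None of these steps can merge two URLs that point at different resources, which is why the abstract calls the standard rules free of false positives but prone to false negatives.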

URL Signatures for Improving URL Normalization (URL 정규화 향상을 위한 URL 서명)

  • Soon, Lay-Ki;Lee, Sang-Ho
    • Journal of KIISE:Databases
    • /
    • v.36 no.2
    • /
    • pp.139-149
    • /
    • 2009
  • In the standard URL normalization mechanism, URLs are normalized syntactically by a set of predefined steps. In this paper, we propose to complement the standard URL normalization by incorporating the semantically meaningful metadata of the web pages. The metadata taken into consideration are the body texts and the page size of the web pages, which can be extracted during HTML parsing. The results from our first exploratory experiment indicate that the body texts are effective in identifying equivalent URLs. Hence, given a URL which has undergone the standard normalization, we construct its URL signature by hashing the body text of the associated web page using Message-Digest algorithm 5 in the second experiment. URLs which share identical signatures are considered to be equivalent in our scheme. The results in the second experiment show that our proposed URL signatures were able to further reduce redundant URLs by 32.94% in comparison with the standard URL normalization.
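The signature scheme can be sketched in Python (names are illustrative; the paper hashes the body text extracted during HTML parsing of already-normalized URLs):

```python
import hashlib
from collections import defaultdict

def signature(body_text: str) -> str:
    """URL signature = MD5 digest of the associated page's body text."""
    return hashlib.md5(body_text.encode("utf-8")).hexdigest()

def group_equivalent(pages: dict) -> dict:
    """Group URLs whose fetched body text hashes to the same signature;
    such URLs are treated as equivalent beyond syntactic normalization."""
    groups = defaultdict(list)
    for url, body in pages.items():
        groups[signature(body)].append(url)
    return {sig: urls for sig, urls in groups.items() if len(urls) > 1}
```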

Web Traffic Analysis using URL-tree and URL-net (URL-tree와 URL-net를 사용한 인터넷 트래픽 분석)

  • 안광림;김기창
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 1998.10a
    • /
    • pp.486-488
    • /
    • 1998
  • Cache servers are used to relieve the congestion caused by growing Internet use. To use a cache server more effectively, understanding the traffic of the network it serves is essential; traffic analysis must precede the design of a more aggressive caching strategy. This paper proposes two data structures, URL-tree and URL-net, and uses them to analyze web traffic. Through these structures, a property of web traffic called "connectivity of reference" can be found. The paper shows how the two data structures help analyze Internet traffic and how such analysis can be used to establish an efficient caching strategy.
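A URL-tree of the kind proposed can be sketched as a path-segment trie with per-node request counts (an illustration of the idea, not the paper's exact structure):

```python
class URLTree:
    """Trie over URL path segments; each node counts how many requests
    passed through it, exposing which prefixes the traffic clusters on."""
    def __init__(self):
        self.count = 0
        self.children = {}

    def add(self, url: str) -> None:
        """Record one request, incrementing every node on its path."""
        node = self
        node.count += 1
        for seg in url.strip("/").split("/"):
            node = node.children.setdefault(seg, URLTree())
            node.count += 1

    def hot_prefixes(self, threshold: int, prefix=""):
        """Yield (prefix, count) pairs requested at least `threshold` times;
        a cache server could prioritize these prefixes."""
        for seg, child in self.children.items():
            path = prefix + "/" + seg
            if child.count >= threshold:
                yield path, child.count
                yield from child.hot_prefixes(threshold, path)
```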


Design and Implementation of a URL Forwarding Server for Providing Multiple Domain Names (다중 도메인명을 지원하는 URL 전달 서버의 설계 및 구현)

  • 노상호;장세현;김상연;양희재
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2000.04a
    • /
    • pp.418-420
    • /
    • 2000
  • This work designs and implements a URL forwarding server that connects existing web sites on the Internet to new domain names. The URL forwarding server maps the web sites of numerous Internet users to simple URLs of various forms. It is implemented on top of the redirection response codes of the HTTP protocol and was tested on a Linux system. This paper compares and analyzes HTTP and the forwarding server, and describes the operation of the implemented forwarding server.
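The redirection-based forwarding can be sketched with Python's `http.server` (the host table and names are hypothetical; the paper's server predates this module but rests on the same HTTP redirection codes):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# hypothetical table mapping forwarded domain names to target URLs
FORWARD = {"myname.example": "http://home.university.example/~student/"}

def redirect_for(host_header: str):
    """Look up the target URL for the Host header of an incoming request."""
    return FORWARD.get((host_header or "").split(":")[0])

class ForwardHandler(BaseHTTPRequestHandler):
    """Answer every GET with an HTTP redirection response, so a browser
    visiting the simple domain lands on the real site."""
    def do_GET(self):
        target = redirect_for(self.headers.get("Host", ""))
        if target:
            self.send_response(301)  # Moved Permanently
            self.send_header("Location", target)
        else:
            self.send_response(404)
        self.end_headers()

# HTTPServer(("", 8080), ForwardHandler).serve_forever() would start it
```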
