• Title/Summary/Keyword: Dynamic Web Pages


Case Study of Building Dynamic Homepage Using ActiveX Control (ActiveX 콘트롤을 이용한 동적 홈페이지의 설계와 구현 사례)

  • 우원택
    • Proceedings of the Korea Society for Industrial Systems Conference / 2003.05a / pp.27-40 / 2003
  • The purpose of this study is to understand what ActiveX controls are and how to utilize them in developing dynamic web pages. For this purpose, a literature survey and experimental practice on a PC were carried out to understand the differences among web programming technologies such as Visual Basic, Java, and XML web services. In order to scrutinize the functions of ActiveX controls and to compare the technology with the more recently introduced XML web services, the history of Internet programming was also studied, with a focus on ActiveX controls. Building on this background, actual design and construction experiments on web pages using the ActiveX Control Pad were carried out. The results show that Microsoft ActiveX controls, formerly known as OLE controls or OCX controls and developed to compete with Java applets on the Internet, turn out to be useful for software reusability, cost saving, and time saving. However, the use of ActiveX controls is confined to the Windows platform. Overall, this study was useful for understanding the usage of ActiveX controls in web pages.

  • PDF

Case Study of Building Dynamic Homepage Using ActiveX Control Pad (ActiveX 컨트롤 패드를 이용한 동적홈페이지의 설계와 구현 사례)

  • 우원택
    • Journal of Korea Society of Industrial Information Systems / v.8 no.2 / pp.108-118 / 2003
  • The purpose of this study is to understand what ActiveX controls are and how to utilize them in developing dynamic web pages. For this purpose, a literature survey and experimental practice on a PC were carried out to understand the differences among web programming technologies such as Visual Basic, Java, and XML web services. In order to scrutinize the functions of ActiveX controls and to compare the technology with the more recently introduced XML web services, the history of Internet programming was also studied, with a focus on ActiveX controls. Building on this background, actual design and construction experiments on web pages using the ActiveX Control Pad were carried out. The results show that Microsoft ActiveX controls, formerly known as OLE controls or OCX controls and developed to compete with Java applets on the Internet, turn out to be useful for software reusability, cost saving, and time saving. However, the use of ActiveX controls is confined to the Windows platform. Overall, this study was useful for understanding the usage of ActiveX controls in web pages.

  • PDF

Improving the quality of Search engine by using the Intelligent agent technology

  • Nguyen, Ha-Nam;Choi, Gyoo-Seok;Park, Jong-Jin;Chi, Sung-Do
    • Journal of the Korea Computer Industry Society / v.4 no.12 / pp.1093-1102 / 2003
  • The dynamic nature of the World Wide Web challenges search engines to find relevant and recent pages. Obtaining important pages early can be very useful when a crawler cannot visit the entire Web in a reasonable amount of time. In this paper we study the order in which spiders should visit URLs so as to obtain the more "important" pages first. We define and apply several metrics and a ranking formula for improving crawling results. The comparison between our results and the breadth-first search (BFS) method shows the efficiency of our experimental system. (A minimal illustrative sketch of importance-ordered crawling follows this entry.)

  • PDF
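
To make the idea concrete, here is a minimal Python sketch (not the authors' system) of importance-ordered crawling: the frontier is a priority queue keyed by a simple importance score, here the number of in-links discovered so far, whereas a plain BFS crawler would use a FIFO queue instead. The fetch_links helper is a hypothetical placeholder for page downloading and link extraction.

```python
import heapq
from collections import defaultdict

def importance_first_crawl(seed_urls, fetch_links, max_pages=100):
    """Visit URLs in descending order of a simple importance score.

    fetch_links(url) is a placeholder that should return a page's outgoing
    links; the in-link count used here is just one possible ranking metric.
    """
    inlink_count = defaultdict(int)              # URL -> discovered in-links
    frontier = [(0, url) for url in seed_urls]   # min-heap of (-score, URL)
    heapq.heapify(frontier)
    crawl_order, seen = [], set()

    while frontier and len(crawl_order) < max_pages:
        _, url = heapq.heappop(frontier)
        if url in seen:
            continue
        seen.add(url)
        crawl_order.append(url)
        for link in fetch_links(url):
            inlink_count[link] += 1
            # Push again with the updated score; stale entries are skipped above.
            heapq.heappush(frontier, (-inlink_count[link], link))
    return crawl_order
```

Replacing the heap with a collections.deque recovers the BFS baseline the abstract compares against.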

Static Analysis of Web Accessibility Based on Abstract Parsing (요약파싱기법을 사용한 웹 접근성의 정적 분석)

  • Kim, Hyunha;Doh, Kyung-Goo
    • Journal of KIISE / v.41 no.12 / pp.1099-1109 / 2014
  • Web-accessibility evaluation tools can be used to determine whether or not a website meets accessibility guidelines. Many such tools have been developed, but most of them fetch and analyze pages dynamically, so some pages may be omitted due to a lack of access authorization or environment information. In this paper, we propose a static method that analyzes web accessibility via abstract parsing. Our abstract-parsing technique understands the syntactic and semantic program structures that dynamically generate web pages according to external inputs and parameters. The static method performs its analysis without omitting any pages because it covers all execution paths. We performed an experiment with a PHP-based website to demonstrate how our tool discovers more accessibility errors than a dynamic page-accessibility analysis tool.
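
As a minimal illustration only, and not the authors' abstract-parsing analysis, the sketch below shows the kind of accessibility rule such tools check (here, the WCAG requirement that images carry alt text), applied to every page variant a program can generate; a static analysis enumerates those variants from the program source, while a dynamic tool sees only the pages it manages to fetch.

```python
import re

# A simplified accessibility rule of the kind such tools check:
# every <img> tag must carry a non-empty alt attribute (WCAG 1.1.1).
IMG_TAG = re.compile(r"<img\b[^>]*>", re.IGNORECASE)
ALT_ATTR = re.compile(r'alt\s*=\s*"[^"]+"', re.IGNORECASE)

def check_alt_text(html_for_each_path):
    """Report missing alt text for every page variant a program can emit.

    html_for_each_path maps an execution-path label to the HTML that path
    generates; a static analysis would enumerate these paths itself, while
    a dynamic tool only sees the pages it manages to fetch.
    """
    errors = []
    for path_label, html in html_for_each_path.items():
        for tag in IMG_TAG.findall(html):
            if not ALT_ATTR.search(tag):
                errors.append((path_label, tag))
    return errors

# Example: the "admin" branch is unreachable for a dynamic crawler without
# credentials, but a static analysis still covers it.
pages = {
    "guest branch": '<img src="logo.png" alt="Site logo">',
    "admin branch": '<img src="chart.png">',   # missing alt -> reported
}
print(check_alt_text(pages))
```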

Comparison and Application of Dynamic and Static Crawling for Extracting Product Data from Web Pages (웹페이지에서의 상품 데이터 추출을 위한 동적, 정적 크롤링 비교 및 활용)

  • Sang-Hyuk Kim;Jeong-Hoon Kim;Seung-Dae Lee
    • The Journal of the Korea institute of electronic communication sciences / v.18 no.6 / pp.1277-1284 / 2023
  • In this paper, a web page was created that makes it easy for consumers to find the event products currently on offer at convenience stores. During development, two crawling methods for extracting event-product data, static crawling and dynamic crawling, were compared and used. Static crawling collects static data directly from the HTML a homepage returns, whereas dynamic crawling collects data from pages that are generated dynamically by a web page. Through the comparison of the two, we studied which crawling method is more effective for extracting event-product data. The web page was then built using the more effective static crawling, with 1+1 and 2+1 products categorized and a search function added.
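
The following sketch, with a placeholder URL and hypothetical CSS selectors, illustrates the two approaches being compared: static crawling parses the HTML the server returns (here with requests and BeautifulSoup), while dynamic crawling drives a real browser (here Selenium) so that JavaScript-generated content is present before extraction.

```python
# Illustrative comparison only: the store URL and CSS selectors below are
# hypothetical placeholders, not the site used in the paper.
import requests
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.common.by import By

STORE_URL = "https://example.com/event-products"   # placeholder URL

def static_crawl(url=STORE_URL):
    """Static crawling: parse the HTML exactly as the server returns it."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [tag.get_text(strip=True) for tag in soup.select(".product-name")]

def dynamic_crawl(url=STORE_URL):
    """Dynamic crawling: let a real browser run the page's JavaScript first."""
    driver = webdriver.Chrome()
    try:
        driver.get(url)
        elements = driver.find_elements(By.CSS_SELECTOR, ".product-name")
        return [el.text for el in elements]
    finally:
        driver.quit()
```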

JsSandbox: A Framework for Analyzing the Behavior of Malicious JavaScript Code using Internal Function Hooking

  • Kim, Hyoung-Chun;Choi, Young-Han;Lee, Dong-Hoon
    • KSII Transactions on Internet and Information Systems (TIIS) / v.6 no.2 / pp.766-783 / 2012
  • Recently, many malicious users have attacked web browsers using JavaScript code that can execute dynamic actions within the browsers. By forcing the browser to execute malicious JavaScript code, attackers can steal personal information stored in the system, trigger malware downloads onto the client's system, and so on. In order to reduce the damage, malicious web pages must be located before general users access the infected pages. In this paper, a novel framework (JsSandbox) that can monitor and analyze the behavior of malicious JavaScript code using internal function hooking (IFH) is proposed. IFH is defined as hooking all functions in the modules, using debug information, and extracting their parameter values. The use of IFH enables the monitoring of functions that API hooking cannot. JsSandbox was implemented based on a debugger engine, and several features were applied to detect and analyze malicious JavaScript code: detection of obfuscation, deobfuscation of obfuscated strings, detection of URLs related to redirection, and detection of exploit code. The proposed framework was then analyzed with respect to these features, and the results demonstrate that JsSandbox can be applied to the analysis of the behavior of malicious web pages.
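
JsSandbox hooks functions inside a debugger engine; as a loose, language-neutral illustration of the hooking idea only (interposing a wrapper that records a call and its arguments before running the original), here is a small Python sketch with a toy stand-in for a sensitive function.

```python
import functools

CALL_LOG = []   # (function name, args, kwargs) tuples observed at run time

def hook(func):
    """Wrap a function so every call and its arguments are recorded first."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        CALL_LOG.append((func.__name__, args, kwargs))
        return func(*args, **kwargs)
    return wrapper

# A toy stand-in for a sensitive function a sandbox might watch,
# e.g. something that evaluates dynamically built strings.
@hook
def evaluate(code_string):
    return len(code_string)   # placeholder behaviour

evaluate("document.write('<iframe src=...>')")
print(CALL_LOG)   # the recorded call reveals the suspicious argument
```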

A Study on Minimizing Infection of Web-based Malware through Distributed & Dynamic Detection Method of Malicious Websites (악성코드 은닉사이트의 분산적, 동적 탐지를 통한 감염피해 최소화 방안 연구)

  • Shin, Hwa-Su;Moon, Jong-Sub
    • Journal of the Korea Institute of Information Security & Cryptology / v.21 no.3 / pp.89-100 / 2011
  • As Internet usage through the web browser increases, web-based malware distributed through websites is becoming a more serious problem than ever. Centralized, crawling-based detection of malicious websites has the problem that the cost of detection grows geometrically as the crawl descends to deeper link levels. In this paper, we propose a browser-based security tool that can dynamically detect malicious web pages and support safe browsing by stopping navigation to any malicious URL injected into those pages. When the tool is deployed to many distributed web-browser users, all of those users participate in malicious-website detection and feedback. As a result, websites at deeper link levels can be detected in a distributed and dynamic manner.
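
A minimal sketch of the browser-side idea, assuming a hypothetical reporting endpoint: each navigation target is checked against a shared blocklist, hits are blocked, and the event is reported back so that many distributed users contribute detections.

```python
# Minimal sketch of the client-side idea only: check each navigation target
# against a shared blocklist and report hits back to a central collector.
# The REPORT_URL endpoint and the blocklist entry are placeholders.
import requests

REPORT_URL = "https://example.com/report"      # placeholder collector
blocklist = {"http://malicious.example/exploit.html"}

def allow_navigation(target_url):
    """Return False (and report) if the target URL is known to be malicious."""
    if target_url in blocklist:
        try:
            requests.post(REPORT_URL, json={"url": target_url}, timeout=5)
        except requests.RequestException:
            pass                               # reporting is best-effort
        return False
    return True

print(allow_navigation("http://malicious.example/exploit.html"))  # False
```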

The type analysis of interactive web pages - Centering around a layout (인터렉티브한(Interactive) 웹 페이지(Web Page)의 유형분석-레이아웃(Layout)을 중심으로)

  • 최선주
    • Proceedings of the Korea Contents Association Conference / 2004.05a / pp.566-574 / 2004
  • Recently, ordinary users have become familiar with various dynamic displays and interfaces owing to the development of web browsers and Flash. As Flash has become popular, it has attracted many more users, who now focus on the unique characteristics of the web: interactive and dynamic design. Therefore, this study performs a type analysis of the layouts of existing interactive sites and surveys their current state, followed by a consideration of design methodology and the utility of the Flash program.

  • PDF

Automatic Extraction of Dependencies between Web Components and Database Resources in Java Web Applications

  • Oh, Jaewon;Ahn, Woo Hyun;Kim, Taegong
    • Journal of information and communication convergence engineering / v.17 no.2 / pp.149-160 / 2019
  • Web applications typically interact with databases; therefore, it is crucial to understand which web components access which database resources when maintaining web apps. Existing research identifies interactions between Java web components, such as JavaServer Pages and servlets, but does not extract dependencies between the web components and database resources, such as tables and attributes. This paper proposes a dynamic analysis of Java web apps that extracts such dependencies from a Java web app and represents them as a graph. The key responsibility of our analysis method is to identify when web components access database resources. To fulfill this responsibility, our method dynamically observes the database-related objects provided in the Java standard library using the proxy pattern, which can be applied to control access to a desired object. This study also experiments with open-source web apps to verify the feasibility of the proposed method.
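
The paper proxies Java's standard database objects; purely as an analogy in Python, the sketch below wraps a sqlite3 cursor so that every executed SQL statement, and hence the table it touches, is recorded together with the component that issued it.

```python
# Loose Python analogy to the paper's proxy-pattern observation of Java's
# database objects: a proxy cursor logs (component, table, SQL) triples.
import re
import sqlite3

ACCESS_LOG = []   # (component, table, SQL) triples

class CursorProxy:
    def __init__(self, cursor, component):
        self._cursor = cursor
        self._component = component          # e.g. the page issuing the query

    def execute(self, sql, params=()):
        match = re.search(r"\b(?:from|into|update)\s+(\w+)", sql, re.IGNORECASE)
        table = match.group(1) if match else "?"
        ACCESS_LOG.append((self._component, table, sql))
        return self._cursor.execute(sql, params)

    def __getattr__(self, name):             # delegate everything else
        return getattr(self._cursor, name)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
cur = CursorProxy(conn.cursor(), component="list_users.page")
cur.execute("SELECT name FROM users")
print(ACCESS_LOG)   # [('list_users.page', 'users', 'SELECT name FROM users')]
```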

An Extended Dynamic Web Page Recommendation Algorithm Based on Mining Frequent Traversal Patterns (빈발 순회패턴 탐사에 기반한 확장된 동적 웹페이지 추천 알고리즘)

  • Lee KeunSoo;Lee Chang Hoon;Yoon Sun-Hee;Lee Sang Moon;Seo Jeong Min
    • Journal of Korea Multimedia Society / v.8 no.9 / pp.1163-1176 / 2005
  • The Web is the largest distributed information space, but an individual's capacity to read and digest content is essentially fixed. In this environment, mining traversal patterns is an important problem in Web mining, with a host of application domains including system design and information services. Conventional traversal-pattern mining systems use inter-page associations within sessions with only a very restricted mechanism (based on a vector or matrix) for generating frequent k-pagesets. We extend a family of algorithms (termed WebPR, for Web Page Recommend) for mining frequent traversal patterns and then recommending pagesets. We add a WebPR(A) algorithm to the family of WebPR algorithms and propose a new winWebPR(T) algorithm that introduces a window concept on top of WebPR(T). Including the two extended algorithms, our experiments with two real data sets, from the LadyAsiana and KBS media server sites, clearly validate that our method outperforms conventional methods. (A minimal sketch of window-based traversal-pattern mining follows this entry.)

  • PDF
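
As a toy illustration of window-based traversal-pattern mining (not the WebPR or winWebPR algorithms themselves), the sketch below counts how often one page follows another within a sliding window over user sessions and recommends the most frequent successors of the current page.

```python
from collections import Counter, defaultdict

def mine_window_patterns(sessions, window=3):
    """Count how often page B follows page A within a sliding window.

    Toy illustration of window-based traversal-pattern mining, not the
    WebPR/winWebPR algorithms from the paper.
    """
    follows = defaultdict(Counter)           # page -> Counter of later pages
    for session in sessions:
        for i, page in enumerate(session):
            for later in session[i + 1 : i + window]:
                follows[page][later] += 1
    return follows

def recommend(follows, current_page, k=2):
    """Recommend the k pages most often visited soon after current_page."""
    return [page for page, _ in follows[current_page].most_common(k)]

sessions = [
    ["home", "news", "sports", "weather"],
    ["home", "sports", "scores"],
    ["news", "sports", "scores"],
]
patterns = mine_window_patterns(sessions)
print(recommend(patterns, "sports"))   # ['scores', 'weather']
```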