• Title/Summary/Keyword: Generate Data

Search Results: 3,066

Awareness and Application of Internet of Things in Universities Libraries in Kwara State, Nigeria

  • Saliu Abdulfatai
    • International Journal of Knowledge Content Development & Technology
    • /
    • v.14 no.4
    • /
    • pp.65-84
    • /
    • 2024
  • This study examined the awareness and application of the Internet of Things (IoT) in university libraries in Kwara State, Nigeria. Four research questions guided the study, and a sample of eighty-five (85) respondents was drawn using the total enumeration sampling technique. A survey method was adopted, seeking answers on the level of awareness of IoT in university libraries in Kwara State, the extent of its application, its benefits, and the challenges faced in applying it. The data collected were analyzed using frequency tables and percentages. The study found that students are aware of IoT in university libraries in Kwara State, and that the benefits of IoT include: devices on IoT platforms are heterogeneous and based on different hardware platforms and networks; IoT provides a high level of interoperability and interconnectivity; IoT platforms carry sensors that detect or measure changes in the environment to generate data that can report on device status or even interact with the environment; IoT combines algorithms and computation with software and hardware to make devices smart; and anything can be interconnected with the global information and communication infrastructure. The study also identified data-interpretation problems, a lack of skilled and specialized workers, cost, online-security challenges, and software complexity as the major challenges in applying IoT in university libraries in Kwara State. In conclusion, the study made several recommendations, including that future libraries be equipped with new technologies and networking devices as soon as possible, since it will be essential for users and librarians to have sufficient knowledge of IoT technologies.

3D Adjacency Spatial Query using 3D Topological Network Data Model (3차원 네트워크 기반 위상학적 데이터 모델을 이용한 3차원 인접성 공간질의)

  • Lee, Seok-Ho;Park, Se-Ho;Lee, Ji-Yeong
    • Spatial Information Research
    • /
    • v.18 no.5
    • /
    • pp.93-105
    • /
    • 2010
  • Spatial neighborhoods are the spaces related to a target space, and a 3D spatial query that searches for them is a significant function in spatial analysis. While various methodologies have been proposed in related studies, this study suggests an adjacency-based one. The methodology implements topological data that represent adjacency using a network-based topological data model, and then applies a modified Dijkstra's algorithm to those data. The results of an ordering analysis of spaces adjacent to a target space were visualized, and ways to exploit them were considered. The objective of this paper is to implement a 3D spatial query that finds the spaces holding an adjacency relationship with a target space in 3D space. The purposes of this study are to 1) generate adjacency-based 3D network data via a network-based topological data model and 2) implement a 3D spatial query for searching spatial neighborhoods by applying Dijkstra's algorithm to these data.
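The adjacency ordering the authors describe (build a network of adjacent 3D spaces, then rank spaces by network distance from the target) can be sketched with a plain Dijkstra pass over unit-weight links; the graph, room names, and unit weights here are hypothetical stand-ins for the paper's topological data model.

```python
import heapq

def adjacency_order(graph, start):
    """Order spaces by network (hop) distance from a target space
    using Dijkstra's algorithm; unit weights turn the result into
    an adjacency-order ranking (1st, 2nd, ... neighborhood)."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr in graph[node]:
            nd = d + 1  # each adjacency link costs 1
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

# Hypothetical 3D network: rooms linked through shared walls/floors
rooms = {"R101": ["R102", "R201"], "R102": ["R101"],
         "R201": ["R101", "R202"], "R202": ["R201"]}
print(adjacency_order(rooms, "R101"))
```

With real data the unit weights would be replaced by whatever cost the topological model assigns to each adjacency link.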

Design and Implementation of Data Distribution Management Module for IEEE 1516 HLA/RTI (IEEE 1516 HLA/RTI 표준을 만족하는 데이터 분산 관리 모듈의 설계 및 구현)

  • Ahn, Jung-Hyun;Hong, Jeong-Hee;Kim, Tag-Gon
    • Journal of the Korea Society for Simulation
    • /
    • v.17 no.2
    • /
    • pp.21-29
    • /
    • 2008
  • The High Level Architecture (HLA) specifies a framework for interoperation between heterogeneous simulators, and the Run-Time Infrastructure (RTI) is an implementation of the HLA Interface Specification. The Data Distribution Management (DDM) services, one category of the IEEE 1516 HLA/RTI management services, control filters for the transmission and reception of data among simulators. In this paper, we propose a design concept for DDM and present its implementation in a lightweight RTI. The concept is to minimize the total volume of messages that each federate and the federation process generate, based on the rate of RTI service execution: the data-transfer mechanism is applied differently depending on that rate. A federate usually publishes or subscribes to data when it starts, then constantly updates the data and modifies the associated regions as it advances its simulation time. The proposed DDM design therefore provides fast update and region modification in exchange for more complex publish and subscribe services. We describe how the proposed DDM is processed in IEEE 1516 HLA/RTI and run experiments over various scenarios, modifying regions, changing the overlap ratio, and increasing the data volume.
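The region-based filtering at the heart of DDM can be illustrated with a minimal overlap test: an attribute update is routed to a subscriber only if the update region and the subscription region overlap in every dimension of the routing space. The region representation below is a simplification assumed for illustration, not the IEEE 1516 API.

```python
def regions_overlap(update, subscribe):
    """DDM-style filtering sketch: an update reaches a subscriber
    only if their extents overlap in every dimension.
    Regions are lists of (lower, upper) bounds, one per dimension."""
    return all(lo1 < hi2 and lo2 < hi1
               for (lo1, hi1), (lo2, hi2) in zip(update, subscribe))

# Hypothetical 2D routing-space regions
update_region = [(0, 10), (5, 15)]
sub_region    = [(8, 20), (0, 6)]
print(regions_overlap(update_region, sub_region))
```

An RTI applies a check like this per update to decide which federates receive the data, which is why region modification cost matters as simulation time advances.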


A Trustworthiness Improving Link Evaluation Technique for LOD considering the Syntactic Properties of RDFS, OWL, and OWL2 (RDFS, OWL, OWL2의 문법특성을 고려한 신뢰향상적 LOD 연결성 평가 기법)

  • Park, Jaeyeong;Sohn, Yonglak
    • Journal of KIISE:Databases
    • /
    • v.41 no.4
    • /
    • pp.226-241
    • /
    • 2014
  • LOD (Linked Open Data) is composed of RDF triples based on ontologies; the triples are identified, linked, and accessed under the principles of linked data. Publication of LOD data sets extends the LOD cloud and ultimately advances the web of data. However, if ontologically identical things in different LOD data sets are identified by different URIs, it is difficult to establish their sameness and to provide trustworthy links among them. To solve this problem, we suggest a Trustworthiness Improving Link Evaluation (TILE) technique. TILE evaluates links in four steps. Step 1 considers the inference properties of the syntactic elements in an LOD data set and generates the RDF triples that had existed only implicitly. In Step 2, TILE appoints predicates, compares their objects across triples, and evaluates the links between the triples' subjects. In Step 3, TILE evaluates each predicate's syntactic property from the standpoints of subject description and vocabulary definition and adjusts the evaluation results of Step 2 accordingly. The syntactic elements TILE considers cover RDFS, OWL, and OWL2, all recommended by W3C. Finally, TILE has the publisher of the LOD data set review the evaluation results and decide whether to re-evaluate or finalize the links. This allows the publishers' responsibility to be reflected in the trustworthiness of the links among the published data.
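Step 2 of TILE (appoint predicates, compare their objects, and score the link between two subjects) might be sketched as follows; the FOAF-style descriptions and the simple match-ratio scoring are hypothetical illustrations, not the paper's exact evaluation formula.

```python
def link_score(desc_a, desc_b, predicates):
    """Sketch of TILE Step 2: for each appointed predicate, compare
    the objects that two subjects carry; the score is the fraction
    of appointed predicates whose objects agree."""
    matches = sum(1 for p in predicates
                  if p in desc_a and p in desc_b and desc_a[p] == desc_b[p])
    return matches / len(predicates)

# Hypothetical descriptions of the same person in two LOD data sets
a = {"foaf:name": "J. Park", "foaf:mbox": "jp@example.org"}
b = {"foaf:name": "J. Park", "foaf:mbox": "jpark@example.org"}
print(link_score(a, b, ["foaf:name", "foaf:mbox"]))  # 0.5
```

In TILE, Step 3 would then weight a score like this by the syntactic (RDFS/OWL/OWL2) properties of each predicate before the publisher reviews it.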

The Improvement of Point Cloud Data Processing Program For Efficient Earthwork BIM Design (토공 BIM 설계 효율화를 위한 포인트 클라우드 데이터 처리 프로그램 개선에 관한 연구)

  • Kim, Heeyeon;Kim, Jeonghwan;Seo, Jongwon;Shim, Ho
    • Korean Journal of Construction Engineering and Management
    • /
    • v.21 no.5
    • /
    • pp.55-63
    • /
    • 2020
  • Earthwork automation has emerged as a promising technology in the construction industry, and its application begins with the acquisition and processing of point cloud data from the site. Point cloud data contain more than a million points because of the vast extent of a construction site, and processing time is critical: generating a Digital Terrain Model (DTM) from the original point cloud can take tens or hundreds of hours, so faster processing can greatly improve modeling efficiency. Currently, a benchmark program (BP) is actively used in Korea as an integrated tool for both point cloud data processing and BIM design, but several aspects of it need modification and refinement. This study modified the BP and developed an updated program by adopting a compile-based development environment, a newly designed UI/UX, and OpenGL, while maintaining the existing PCD-processing functions and expanding compatibility with PCD file formats. We conducted a comparative loading-speed test with different numbers of points, and the results showed a 92 to 99% performance increase in the developed program. This program can serve as a foundation for future development that narrows the gap between design and construction by integrating PCD and earthwork BIM functions.

Development the Geostationary Ocean Color Imager (GOCI) Data Processing System (GDPS) (정지궤도 해색탑재체(GOCI) 해양자료처리시스템(GDPS)의 개발)

  • Han, Hee-Jeong;Ryu, Joo-Hyung;Ahn, Yu-Hwan
    • Korean Journal of Remote Sensing
    • /
    • v.26 no.2
    • /
    • pp.239-249
    • /
    • 2010
  • The Geostationary Ocean Color Imager (GOCI) data-processing system (GDPS), a software system for satellite data processing and analysis for the first geostationary ocean color observation satellite, has been developed concurrently with the satellite itself. The GDPS generates level 2 and level 3 oceanographic analytical data from level 1B data, which comprise the total radiance information, through software modules implementing a specialized atmospheric algorithm and oceanic analytical algorithms. The GDPS will be a multi-version system: the standard operational system of the Korea Ocean Satellite Center (KOSC) as well as a basic GOCI data-processing system for researchers and other users. Additionally, the GDPS will be used to distribute GOCI images over the satellite network, to calculate the lookup table of radiometric calibration coefficients, to divide and mosaic regional images, and to analyze time-series satellite data. The developed GDPS satisfies the user requirement of completing data production within 30 minutes. The system is expected to be an excellent tool for monitoring both long-term and short-term changes in ocean environmental characteristics.

Resolving the 'Gray sheep' Problem Using Social Network Analysis (SNA) in Collaborative Filtering (CF) Recommender Systems (소셜 네트워크 분석 기법을 활용한 협업필터링의 특이취향 사용자(Gray Sheep) 문제 해결)

  • Kim, Minsung;Im, Il
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.2
    • /
    • pp.137-148
    • /
    • 2014
  • Recommender systems have become one of the most important technologies in e-commerce. For many consumers, the ultimate reason to shop online is to reduce the effort of information search and purchase, and recommender systems are a key technology for serving these needs. Many past studies of recommender systems have been devoted to developing and improving recommendation algorithms, and collaborative filtering (CF) is known to be the most successful. Despite its success, however, CF has several shortcomings, such as the cold-start, sparsity, and gray sheep problems. To generate recommendations, ordinary CF algorithms require evaluations or preference information directly from users; for new users who have none, CF cannot produce recommendations (the cold-start problem). As the numbers of products and customers increase, the scale of the data grows exponentially and most data cells remain empty; such a sparse dataset makes the computation for recommendation extremely hard (the sparsity problem). Since CF assumes that there are groups of users sharing common preferences or tastes, CF becomes inaccurate when many users have rare and unique tastes (the gray sheep problem). This study proposes a new algorithm that utilizes Social Network Analysis (SNA) techniques to resolve the gray sheep problem. We use degree centrality in SNA to identify users with unique preferences (gray sheep). Degree centrality refers to the number of direct links to and from a node. In a network of users connected through common preferences or tastes, those with unique tastes have fewer links to other users (nodes) and are isolated from them; gray sheep can therefore be identified by calculating the degree centrality of each node. We divide the dataset into two parts, gray sheep and others, based on the users' degree centrality, and then apply different similarity measures and recommendation methods to the two datasets. The detailed algorithm is as follows. Step 1: Convert the initial data, a two-mode network (user to item), into a one-mode network (user to user). Step 2: Calculate the degree centrality of each node and separate the nodes whose degree centrality falls below a pre-set threshold; the threshold is determined by simulation so that the accuracy of CF on the remaining dataset is maximized. Step 3: Apply an ordinary CF algorithm to the remaining dataset. Step 4: Since the separated dataset consists of users with unique tastes, an ordinary CF algorithm cannot generate recommendations for them, so a 'popular item' method is used instead. The F measures of the two datasets, weighted by their numbers of nodes, are summed to form the final performance metric. To test the performance improvement, an empirical study was conducted using a publicly available dataset, the MovieLens data from the GroupLens research team: 100,000 evaluations by 943 users on 1,682 movies. The proposed algorithm was compared with an ordinary CF algorithm using best-N-neighbors and cosine similarity. The empirical results show that the F measure improved by about 11% on average with the proposed algorithm. Past studies that improved CF performance typically used information beyond users' evaluations, such as demographic data, and some applied SNA techniques as a new similarity metric; this study is novel in that it uses SNA to separate the dataset. It shows that CF performance can be improved, without any additional information, when SNA techniques are used as proposed. The study has several theoretical and practical implications. It empirically shows that the characteristics of a dataset can affect the performance of CF recommender systems, which helps researchers understand the factors affecting CF performance, and it opens a door for future studies applying SNA to CF to analyze dataset characteristics. In practice, it provides guidelines for improving the performance of CF recommender systems with a simple modification.
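Steps 1 and 2 of the proposed algorithm (two-mode to one-mode projection, then a degree-centrality split) can be sketched on toy data; the ratings, threshold, and variable names are hypothetical.

```python
from collections import defaultdict

def split_gray_sheep(ratings, threshold):
    """Sketch of Steps 1-2: project the two-mode user-item network
    to a one-mode user-user network (users linked when they rated a
    common item), compute degree centrality as the number of direct
    links, and separate users below the threshold as gray sheep."""
    # Step 1: two-mode (user-item) -> one-mode (user-user) projection
    item_users = defaultdict(set)
    for user, items in ratings.items():
        for item in items:
            item_users[item].add(user)
    neighbors = defaultdict(set)
    for users in item_users.values():
        for u in users:
            neighbors[u] |= users - {u}
    # Step 2: degree centrality = number of direct links per node
    gray = {u for u in ratings if len(neighbors[u]) < threshold}
    return gray, set(ratings) - gray

# Toy data: user -> set of rated items (hypothetical)
ratings = {"a": {"i1", "i2"}, "b": {"i1", "i2"}, "c": {"i2", "i3"},
           "d": {"i9"}}  # "d" shares no item -> isolated gray sheep
gray, rest = split_gray_sheep(ratings, threshold=1)
print(gray, rest)
```

Ordinary CF (Step 3) would then run on `rest`, while the users in `gray` would receive popular-item recommendations (Step 4).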

  • Implementation of vehicle state monitoring system using WCDMA (WCDMA를 이용한 자동차 상태 모니터링 시스템 구현)

    • Song, Min-Seob;Baek, Sung-Hyun;Jang, Jong-Wook
      • Proceedings of the Korean Institute of Information and Commucation Sciences Conference
      • /
      • 2012.05a
      • /
      • pp.343-346
      • /
      • 2012
    • Today, as third-generation mobile network services come into wide use, WCDMA module development technology and its applications are expanding, and many IT-convergence industries are emerging as a result. In this paper, we developed a system that retrieves vehicle information over OBD-II communication, transfers the data to an external server, and allows the vehicle's status to be monitored from other external devices. The system reads information from the various sensors inside a vehicle through the OBD-II connector, converts it into a form that is easier for the user to read, and transfers it to an external data server using a WCDMA module. To test the performance of the developed system, a vehicle simulator was used to generate the data that would occur in a real car. The data generated at the OBD-II connector were sent to the vehicle status monitoring system, which was confirmed to receive them without error; likewise, when these data were transmitted to an external server over WCDMA, the same data were confirmed to be received without error. In the future, this OBD-II-based technology can be used in a wide range of vehicle IT convergence applications.
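The kind of sensor reading the system pulls through the OBD-II connector can be illustrated with the standard Mode 01 PID 0C (engine RPM) decoding rule; the raw response string below is a made-up example, and the paper's actual conversion code is not shown in the abstract.

```python
def parse_rpm(response_hex):
    """Parse a standard OBD-II Mode 01 PID 0C (engine RPM) response.
    Response bytes are 41 0C A B, and RPM = ((A * 256) + B) / 4."""
    b = bytes.fromhex(response_hex.replace(" ", ""))
    assert b[0] == 0x41 and b[1] == 0x0C, "not a PID 0C response"
    return ((b[2] << 8) | b[3]) / 4

print(parse_rpm("41 0C 1A F8"))  # 0x1AF8 = 6904 -> 1726.0 rpm
```

A monitoring system like the one described would decode values such as this before forwarding the human-readable result to the server over WCDMA.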


    Seismic AVO Analysis, AVO Modeling, AVO Inversion for understanding the gas-hydrate structure (가스 하이드레이트 부존층의 구조파악을 위한 탄성파 AVO 분석 AVO모델링, AVO역산)

    • Kim Gun-Duk;Chung Bu-Heung
      • Proceedings of the Korean Society for New and Renewable Energy Conference
      • /
      • 2005.06a
      • /
      • pp.643-646
      • /
      • 2005
    • In gas hydrate exploration using seismic reflection data, detecting the BSR (Bottom Simulating Reflector) on the seismic section is the most important step in the workflow, because the BSR is interpreted as forming at the base of a gas hydrate zone. A BSR typically shows several qualitative characteristics on a seismic section: a wavelet phase reversal compared with the sea-bottom signal, parallelism with the sea bottom, strong amplitude, masking above the BSR, and cross-bedding with other geological layers. Even though a candidate BSR can be picked with this guidance, that is not enough to confirm it as a true BSR; other methods, such as interval velocity analysis and AVO (Amplitude Variation with Offset) analysis, are available for verifying it quantitatively. AVO work can be divided into three main parts: AVO analysis, AVO modeling, and AVO inversion. AVO analysis is a unique method for directly detecting a free-gas zone on a seismic section, so it is useful for discriminating a true BSR, which may arise from a Poisson's ratio contrast between a high-velocity, partially hydrated sediment layer and a low-velocity, water-saturated gas-sediment layer. During AVO interpretation, the AVO response changes with the water saturation ratio, so the response of a gas layer can be confused with that of a dry layer. In that case, AVO modeling is needed to generate a synthetic seismogram for comparison with the real data; conclusions can be drawn from the correspondence, or lack of it, between the two seismograms. AVO inversion derives a geological model by iterative operation until the resulting synthetic seismogram matches the real-data seismogram within some tolerance level. AVO inversion is a topic of current research, and for now there is no general consensus on how the process should be done, or even whether it is valid for standard seismic data. Unfortunately, no well-log data have been acquired from the gas hydrate exploration area in Korea; instead, well-log and seismic data acquired from a gas-sand area located near the gas hydrate exploration area were used for the AVO analysis. The AVO modeling confirmed a type III AVO anomaly on the gas-sand layer. The constants of Castagna's equation for estimating the S-wave velocity were evaluated as A = 0.86190 and B = -3845.14431, with a water saturation ratio of 50%. The Zoeppritz equation was used to calculate the reflection coefficients of the synthetic seismogram, and the dataset provided by Hampson-Russell Co. was used for the AVO inversion.
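The Castagna-type relation the abstract reports, with the fitted constants A = 0.86190 and B = -3845.14431, is a simple linear estimate of S-wave velocity from P-wave velocity; the input velocity below is hypothetical, and the units follow whatever units the constants were fitted in.

```python
def castagna_vs(vp, a=0.86190, b=-3845.14431):
    """Linear Castagna-type relation Vs = a * Vp + b, using the
    constants the abstract reports for the study area (the classic
    mudrock line uses a = 0.862, b = -1172 in m/s)."""
    return a * vp + b

# Hypothetical P-wave velocity for a water-saturated sand layer
vp = 8000.0
print(castagna_vs(vp))
```

An estimate like this supplies the S-wave velocity needed by the Zoeppritz equation when computing the reflection coefficients of the synthetic seismogram.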


    Validation of Surface Reflectance Product of KOMPSAT-3A Image Data Using RadCalNet Data (RadCalNet 자료를 이용한 다목적실용위성 3A 영상 자료의 지표 반사도 성과 검증)

    • Lee, Kiwon;Kim, Kwangseob
      • Korean Journal of Remote Sensing
      • /
      • v.36 no.2_1
      • /
      • pp.167-178
      • /
      • 2020
    • KOMPSAT-3A images have been used in various applications since the satellite's launch in 2015. However, scientific analyses and extended applications of these data, such as vegetation index estimation, have been limited because no tool was available to produce the surface reflectance required for analysis of the actual land environment. Surface reflectance is the product of an absolute atmospheric correction or calibration. The objective of this study is to quantitatively verify the accuracy of the top-of-atmosphere (TOA) reflectance and surface reflectance of KOMPSAT-3A images produced by an OTB open-source extension program, by cross-validating them against site measurement data from RadCalNet, an international Calibration/Validation (Cal/Val) portal. In addition, surface reflectance obtained from Landsat-8 OLI images of the same site was included in the cross-validation. The experiments show that the TOA reflectance of the KOMPSAT-3A images differs by at most ±0.02, on a scale of 0.00 to 1.00, from the mean RadCalNet value in the corresponding spectral band. The surface reflectance of the KOMPSAT-3A images also shows a high degree of consistency with the RadCalNet data, with differences of 0.02 to 0.04. These results should be applicable to generating value-added products from KOMPSAT-3A images as analysis-ready data (ARD). The tools applied in this study can be extended to new sensor-model implementations for the multispectral images of the compact advanced satellites (CAS) for land, agriculture, and forestry, and the research scheme can be reused as the verification method.
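The cross-validation described (per-band difference between image-derived reflectance and the RadCalNet site mean, on a 0.00 to 1.00 scale) reduces to a simple per-band subtraction; the band names and reflectance values below are hypothetical, not the study's measurements.

```python
def reflectance_bias(sensor, reference):
    """Per-band difference between image-derived reflectance and a
    reference site mean, both on a 0.00-1.00 reflectance scale."""
    return {band: round(sensor[band] - reference[band], 4)
            for band in sensor}

# Hypothetical band means for one scene and the matching site record
kompsat_toa = {"blue": 0.118, "green": 0.142, "red": 0.171, "nir": 0.305}
radcalnet   = {"blue": 0.105, "green": 0.150, "red": 0.168, "nir": 0.290}
print(reflectance_bias(kompsat_toa, radcalnet))
```

The study's acceptance criterion would then be whether each band's bias stays within the reported ±0.02 (TOA) or 0.02 to 0.04 (surface reflectance) range.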


    (34141) Korea Institute of Science and Technology Information, 245, Daehak-ro, Yuseong-gu, Daejeon
    Copyright (C) KISTI. All Rights Reserved.