• Title/Summary/Keyword: Tracking Time

Search Results: 3,297

Serial MR Imaging of Magnetically Labeled Human Umbilical Vein Endothelial Cells in Acute Renal Failure Rat Model (급성 신부전 쥐 모델에서 자기 표지된 인간 제대정맥 내피세포의 연속 자기공명영상)

  • Lee, Sun Joo;Lee, Sang Yong;Kang, Kyung Pyo;Kim, Won;Park, Sung Kwang
    • Investigative Magnetic Resonance Imaging / v.17 no.3 / pp.181-191 / 2013
  • Purpose: To evaluate the usefulness of in vivo magnetic resonance (MR) imaging for tracking intravenously injected superparamagnetic iron oxide (SPIO)-labeled human umbilical vein endothelial cells (HUVECs) in an acute renal failure (ARF) rat model. Materials and Methods: HUVECs were labeled with SPIO and poly-L-lysine (PLL) complex. Relaxation rates at 1.5-T MR, cell viability, and labeling stability were assessed. HUVECs were injected into the tail vein of ARF rats (labeled cells in 10 rats, unlabeled cells in 2 rats). Follow-up serial T2*-weighted gradient-echo MR imaging was performed at 1, 3, 5, and 7 days after injection, and the MR findings were compared with histologic findings. Results: An average of 98.4 ± 2.4% of cells were Prussian blue stain-positive after labeling with the SPIO-PLL complex. The relaxation rates (R2*) of cultured HUVECs at days 3 and 5 were not markedly decreased compared with that at day 1. The stability of SPIO in HUVECs was maintained during proliferation of the HUVECs in culture media. In the presence of left unilateral renal artery ischemia, T2*-weighted MR imaging performed 1 day after intravenous injection of labeled HUVECs revealed a significant signal intensity (SI) loss exclusively in the outer medulla of the left kidney, but not in the right kidney. MR imaging at days 3, 5, and 7 after injection also showed an SI loss in the outer medulla of the ischemically injured kidney, but the SI progressively recovered with time, while the right kidney showed no significant change in SI over the same period. On histologic analysis, the SI loss on MR images corresponded to the presence of Prussian blue-stained cells, primarily in the renal outer medulla. Conclusion: MR imaging appears to be useful for in vivo monitoring of intravenously injected SPIO-labeled HUVECs in the ischemically injured rat kidney.
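The relaxation-rate comparison in this abstract rests on estimating R2* (= 1/T2*) from multi-echo gradient-echo signals. The sketch below is a generic illustration, not the paper's analysis: the echo times and signal values are hypothetical, and it simply fits the standard monoexponential decay model S(TE) = S0 · exp(-R2*·TE) with SciPy.

```python
# Minimal sketch: estimating R2* (= 1/T2*) from multi-echo gradient-echo signals.
# Assumes the standard monoexponential model S(TE) = S0 * exp(-R2* * TE);
# echo times and signal values below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def gre_signal(te_ms, s0, r2star_per_ms):
    """Monoexponential T2* decay."""
    return s0 * np.exp(-r2star_per_ms * te_ms)

te_ms = np.array([2.0, 5.0, 10.0, 15.0, 20.0, 30.0])           # echo times (ms)
signal = np.array([950.0, 820.0, 640.0, 500.0, 390.0, 240.0])  # mean ROI signal (a.u.)

(s0_fit, r2star_fit), _ = curve_fit(gre_signal, te_ms, signal, p0=(1000.0, 0.05))
print(f"R2* = {r2star_fit * 1000:.1f} 1/s, T2* = {1.0 / r2star_fit:.1f} ms")
```

SPIO labeling shortens T2*, so a labeled-cell deposit shows up as a higher fitted R2* (and as SI loss on T2*-weighted images), which is the quantity tracked across days in the abstract.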

Analysis of Behavioral Characteristics of Broilers by Feeding, Drinking, and Resting Spaces according to Stocking Density using Image Analysis Technique (영상분석기법을 활용한 사육밀도에 따른 급이·급수 및 휴식공간별 육계의 행동특성 분석)

  • Kim, Hyunsoo;Kang, HwanKu;Kang, Boseok;Kim, ChanHo
    • Journal of the Korea Academia-Industrial cooperation Society / v.21 no.12 / pp.558-569 / 2020
  • This study examined the frequency of broilers' stays in the feeding, drinking, and resting areas at different stocking densities using an ICT-based image analysis technique, from the perspective of precision livestock farming (PLF), in order to understand the normal behavior patterns of broilers by age as domestic broiler farming expands. Broilers were reared in an experimental pen (3.3 × 2.7 m) in a poultry house in Gyeonggi Province. The stocking densities were 9.5 birds/m² (n=85) and 19 birds/m² (n=170), and the frequency of stay in the feeding, drinking, and resting areas was monitored using a top-view camera. Image data for three color-marked broilers identified at each stocking density were acquired for six hours at 12, 16, 22, 27, and 29 days of age. From the collected image data, an object-tracking technique was used to record cumulative movement paths by connecting approximately 640,000 frames at 30 fps and to quantify the frequency of stay in each area. At both stocking densities, the frequency of stay was highest in the resting area, followed by the feeding and drinking areas (p<0.001): 57.9, 24.2, and 17.9% at 9.5 birds/m², and 73.2, 16.8, and 10.0% at 19 birds/m², respectively. These results show that the frequency of broilers' stays in each area can be evaluated at different stocking densities using an ICT-based image analysis technique that minimizes stress. This method is expected to provide basic data for developing an ICT-based management system based on real-time monitoring.
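The dwell-time percentages reported above come from counting, per tracked bird, how many frames its position falls inside each functional area. A minimal sketch of that counting step is given below, assuming hypothetical rectangular feeding/drinking/resting zones and an already-available list of per-frame centroids; the paper's own detector and tracker are not reproduced.

```python
# Minimal sketch: converting per-frame tracked centroids into dwell-time
# percentages per functional area. The zone rectangles and the centroid list are
# hypothetical; any object tracker could supply the centroids.
from collections import Counter

# (x_min, y_min, x_max, y_max) in pixels -- hypothetical zone layout
ZONES = {
    "feeding":  (0,   0,   400, 300),
    "drinking": (400, 0,   800, 300),
    "resting":  (0,   300, 800, 900),
}

def zone_of(x, y):
    for name, (x0, y0, x1, y1) in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return "other"

def dwell_percentages(centroids):
    """centroids: iterable of (x, y) positions, one per frame, for one bird."""
    counts = Counter(zone_of(x, y) for x, y in centroids)
    total = sum(counts.values())
    return {name: 100.0 * counts[name] / total for name in ZONES}

# Example with a short fake track (3 frames in the resting zone, 1 in feeding)
track = [(100, 500), (120, 510), (130, 520), (50, 100)]
print(dwell_percentages(track))
```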

An Artificial Intelligence Approach to Waterbody Detection of the Agricultural Reservoirs in South Korea Using Sentinel-1 SAR Images (Sentinel-1 SAR 영상과 AI 기법을 이용한 국내 중소규모 농업저수지의 수표면적 산출)

  • Choi, Soyeon;Youn, Youjeong;Kang, Jonggu;Park, Ganghyun;Kim, Geunah;Lee, Seulchan;Choi, Minha;Jeong, Hagyu;Lee, Yangwon
    • Korean Journal of Remote Sensing / v.38 no.5_3 / pp.925-938 / 2022
  • Agricultural reservoirs are an important water resource nationwide and are vulnerable to abnormal climate effects such as drought caused by climate change, so enhanced management is required for their appropriate operation. Although continuous monitoring of the water level is necessary, on-site measurement and observation are challenging due to practical constraints. This study presents an objective comparison of multiple AI models for water-body extraction using radar images, which have the advantages of wide coverage and frequent revisits. The proposed methods use Sentinel-1 Synthetic Aperture Radar (SAR) images and, unlike common water-extraction methods based on optical images, are suitable for long-term monitoring because they are less affected by weather conditions. We built four AI models, namely Support Vector Machine (SVM), Random Forest (RF), Artificial Neural Network (ANN), and Automated Machine Learning (AutoML), using drone images, Sentinel-1 SAR, and DSM data. A total of 22 reservoirs of less than 1 million tons, including small and medium-sized reservoirs with an effective storage capacity of less than 300,000 tons, were selected for the study. Forty-five images from the 22 reservoirs were used for model training and verification, and the results show that the AutoML model was better than the other three models by 0.01 to 0.03 in water Intersection over Union (IoU), with an accuracy of 0.92 and an mIoU of 0.81 on the test set. In conclusion, AutoML performed as well as, or slightly better than, the classical machine learning methods, and the AutoML-based water-body extraction technique is expected to be applicable to automatic reservoir monitoring.
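The models above are compared using water IoU and mIoU; the sketch below shows one common way to compute those scores from binary water masks. It is a generic illustration under the assumption of 0/1 NumPy arrays, not the paper's evaluation code.

```python
# Minimal sketch: Intersection over Union for a binary water mask and its mean
# over the {water, non-water} classes. Masks are assumed to be 0/1 NumPy arrays.
import numpy as np

def iou(pred, truth, cls):
    p, t = (pred == cls), (truth == cls)
    inter = np.logical_and(p, t).sum()
    union = np.logical_or(p, t).sum()
    return inter / union if union else float("nan")

def water_iou_and_miou(pred, truth):
    water = iou(pred, truth, 1)
    background = iou(pred, truth, 0)
    return water, np.nanmean([water, background])

# Tiny hypothetical 4x4 masks (1 = water)
truth = np.array([[1, 1, 0, 0],
                  [1, 1, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
pred  = np.array([[1, 1, 1, 0],
                  [1, 0, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
print(water_iou_and_miou(pred, truth))  # water IoU = 3/5, mIoU = mean(3/5, 11/13)
```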

Development of tracer concentration analysis method using drone-based spatio-temporal hyperspectral image and RGB image (드론기반 시공간 초분광영상 및 RGB영상을 활용한 추적자 농도분석 기법 개발)

  • Gwon, Yeonghwa;Kim, Dongsu;You, Hojun;Han, Eunjin;Kwon, Siyoon;Kim, Youngdo
    • Journal of Korea Water Resources Association / v.55 no.8 / pp.623-634 / 2022
  • Due to river maintenance projects such as the creation of waterfront areas along rivers and the Four Rivers Project, the flow characteristics of rivers are continuously changing, and the risk of water quality accidents caused by the inflow of various pollutants is increasing. In the event of a water quality accident, it is necessary to minimize the effect on downstream reaches by predicting the concentration and arrival time of the pollutants in consideration of the flow characteristics of the river. To track the behavior of these pollutants, the diffusion and dispersion coefficients must be calculated for each river section; among them, the dispersion coefficient is used to analyze the spreading range of soluble pollutants. Existing experimental studies for tracking pollutant behavior require considerable manpower and cost, and it is difficult to obtain spatially high-resolution data because of limited equipment operation. Recently, research on tracking contaminants using RGB drones has been conducted, but RGB images are limited in the spectral information they can collect. In this study, to supplement the limitations of existing studies, a hyperspectral sensor was mounted on a drone-based remote sensing platform to collect data at higher temporal and spatial resolution than conventional contact measurements. Using the collected spatio-temporal hyperspectral images, the tracer concentration was calculated and the transverse dispersion coefficient was derived. It is expected that, by overcoming the limitations of the drone platform through future research and by refining the dispersion coefficient calculation technique, it will become possible to detect various pollutants leaking into the water system as well as changes in various water quality parameters and river factors.
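Transverse dispersion coefficients such as the one derived above are commonly estimated from cross-sectional concentration profiles by the method of moments, where the growth of the lateral variance of the tracer cloud with travel time gives the coefficient. The sketch below illustrates that generic moment calculation on hypothetical concentration profiles; it is not the paper's algorithm, and all numbers are made up.

```python
# Minimal sketch: transverse dispersion coefficient by the method of moments,
# D_y ~= 0.5 * d(sigma_y^2)/dt, from concentration profiles C(y) observed at two
# times. Profiles and times are hypothetical.
import numpy as np

def lateral_variance(y, c):
    """Second central moment of a concentration profile C(y)."""
    mass = np.trapz(c, y)
    ybar = np.trapz(y * c, y) / mass
    return np.trapz((y - ybar) ** 2 * c, y) / mass

y = np.linspace(-50.0, 50.0, 201)            # transverse coordinate (m)
c1 = np.exp(-y**2 / (2 * 4.0**2))            # profile at t1 (sigma = 4 m)
c2 = np.exp(-y**2 / (2 * 6.0**2))            # profile at t2 (sigma = 6 m)
t1, t2 = 100.0, 300.0                        # seconds after injection

dy_coeff = 0.5 * (lateral_variance(y, c2) - lateral_variance(y, c1)) / (t2 - t1)
print(f"D_y ~ {dy_coeff:.3f} m^2/s")         # expect ~0.5 * (36 - 16) / 200 = 0.05
```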

Behavior of amber fish, Seriola aureovittata released in the setnet (정치망내에 방류한 부시리, Seriola aureovittata 의 행동)

  • 신현옥;이주희
    • Journal of the Korean Society of Fisheries and Ocean Technology / v.35 no.2 / pp.161-169 / 1999
  • This paper describes the swimming and escaping behavior of amber fish, Seriola aureovittata, released in the first bag net of a set net and observed with telemetry techniques. The set net used in the experiment is composed of a leader, a fish court with a flying net, and two bag nets with ramp nets. The behavior of a fish fitted with a 50 kHz ultrasonic depth pinger was observed using a prototype LBL fish tracking system. The 3-D underwater position of the fish was calculated by the hyperbolic method using three receiver channels and the depth from the pinger. The results obtained are as follows: 1. The fish released at the sea surface dived to a depth of 15 m and rose back up near the sea surface within 5 minutes after release. The average swimming speed of the fish during this time was 0.87 m/sec. 2. The swimming speed of the fish decreased slowly with elapsed time, and the fish showed some escaping behavior toward the fish court while staying in the 1 to 7 m depth layer near the ramp net. The average speed of the fish during this time was 0.52 m/sec. 3. During the 25 minutes after the start of net hauling, the fish swam faster than before hauling and repeatedly showed escaping behavior horizontally from the first ramp net to the second one; vertically, the fish moved up and down between the sea surface and a depth of 20 m. After this time, the fish returned to the first ramp net and showed escaping behavior toward the fish court even though hauling continued. It was found that the fish escaped from the first ramp net to the fish court while hauling was carried out. The average speed of the fish after the start of hauling was 0.72 m/sec, 38.5% faster than just before hauling, and ranged from 0.44 to 0.82 m/sec until the fish escaped from the first bag net. The average swimming speed during the observation was 0.67 m/sec (2.2 body lengths per second).
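The 3-D position in this study was obtained with a hyperbolic (time-difference-of-arrival) solution from three receivers plus the depth telemetered by the pinger. The sketch below shows a generic least-squares TDOA solve under assumed receiver coordinates, sound speed, and time differences; it is illustrative only and does not reproduce the authors' LBL system.

```python
# Minimal sketch: hyperbolic (TDOA) positioning in the horizontal plane, with the
# depth taken directly from the depth pinger. Receiver positions, sound speed and
# time differences are hypothetical.
import numpy as np
from scipy.optimize import least_squares

RECEIVERS = np.array([[0.0, 0.0], [60.0, 0.0], [0.0, 80.0]])  # (x, y) in metres
DEPTH = 12.0          # m, telemetered by the pinger
SOUND_SPEED = 1500.0  # m/s

def slant_range(xy, rec):
    return np.hypot(np.linalg.norm(xy - rec), DEPTH)

def residuals(xy, tdoa_pairs):
    # tdoa_pairs: list of (i, j, dt) with dt = t_i - t_j in seconds
    return [slant_range(xy, RECEIVERS[i]) - slant_range(xy, RECEIVERS[j])
            - SOUND_SPEED * dt
            for i, j, dt in tdoa_pairs]

# Hypothetical time differences for a fish near (25, 30) at 12 m depth
true_xy = np.array([25.0, 30.0])
tdoas = [(1, 0, (slant_range(true_xy, RECEIVERS[1]) - slant_range(true_xy, RECEIVERS[0])) / SOUND_SPEED),
         (2, 0, (slant_range(true_xy, RECEIVERS[2]) - slant_range(true_xy, RECEIVERS[0])) / SOUND_SPEED)]

sol = least_squares(residuals, x0=[10.0, 10.0], args=(tdoas,))
print("estimated (x, y, z):", sol.x[0], sol.x[1], DEPTH)
```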


Sediment Particulate Motions Over a Ripple Under Different Wave Amplitude Conditions (파랑에 의한 해저 사련 위에서의 유사입자의 거동 특성)

  • Chang, Yeon S.;Ahn, Kyungmo;Hwang, Jin H.;Park, Young-Gyu
    • Journal of Korean Society of Coastal and Ocean Engineers / v.25 no.6 / pp.374-385 / 2013
  • Sediment particle motions have been numerically simulated over a sinusoidal ripple. Turbulent boundary layer flows are generated by Large Eddy Simulation, and the sediment particle motions are simulated using a Lagrangian particle tracking method. Two unsteady flow conditions are used in the experiment by employing two different wave amplitudes while keeping other conditions, such as the wave period, the same. As expected, the amount of suspended sediment particles clearly depends on the wave amplitude, increasing with increasing flow intensity. However, it is also observed that the pattern of suspension differs between the two cases, even though the wave amplitude is the only condition that differs. In particular, the time of maximum sediment suspension within the wave period does not coincide between the two cases, because sediment suspension is strongly affected by turbulent eddies that form over the ripple at different times in the two cases. The role of these turbulent eddies in sediment suspension is important, as has also been confirmed in previous studies. However, it is also found that the timing of eddy formation may depend on the wave amplitude over rippled beds. Therefore, both the flow and the geometric conditions under waves have to be considered in order to better understand the sediment suspension process over ripples. In addition, it is found that high turbulent energy and strong upward flow velocities occur at the time of eddy formation, which also supports the high suspension rate at these time steps. The results indicate that the relationship between flow structure and bedforms has to be carefully examined when studying sediment suspension in coastal regions.
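The Lagrangian particle tracking referred to above advances each sediment grain with the local fluid velocity plus a settling term. A heavily simplified sketch of that idea is below, assuming a prescribed analytic velocity field and a constant settling velocity; the LES flow fields and the full particle equation of motion used in the paper are not reproduced.

```python
# Minimal sketch: Lagrangian tracking of sediment particles advected by a given
# velocity field with a constant settling velocity. The oscillatory velocity
# field, settling velocity and time step are hypothetical stand-ins for the
# LES fields used in the study.
import numpy as np

W_S = 0.01          # settling velocity (m/s), hypothetical
DT = 0.01           # time step (s)
PERIOD = 5.0        # wave period (s)

def fluid_velocity(x, z, t):
    """Toy oscillatory flow over a ripple (not the LES field)."""
    u = 0.5 * np.cos(2 * np.pi * t / PERIOD) * (1 + 0.2 * np.sin(2 * np.pi * x))
    w = 0.05 * np.sin(2 * np.pi * t / PERIOD) * np.cos(2 * np.pi * x)
    return u, w

def advance(particles, t):
    """Explicit Euler step for an array of particle positions (N, 2) = (x, z)."""
    u, w = fluid_velocity(particles[:, 0], particles[:, 1], t)
    particles[:, 0] += u * DT
    particles[:, 1] += (w - W_S) * DT
    particles[:, 1] = np.maximum(particles[:, 1], 0.0)   # crude bed at z = 0
    return particles

particles = np.column_stack([np.random.rand(100), 0.05 * np.random.rand(100)])
t = 0.0
for _ in range(int(PERIOD / DT)):          # integrate over one wave period
    particles = advance(particles, t)
    t += DT
print("mean suspension height after one period:", particles[:, 1].mean())
```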

A Study for Design and Performance Improvement of the High-Sensitivity Receiver Architecture based on Global Navigation Satellite System (GNSS 기반의 고감도 수신기 아키텍처 설계 및 성능 향상에 관한 연구)

  • Park, Chi-Ho;Oh, Young-Hwan
    • Journal of the Institute of Electronics Engineers of Korea TC / v.45 no.4 / pp.9-21 / 2008
  • In this paper, we propose a GNSS-based RF receiver, a high-precision localization architecture, and a high-sensitivity localization architecture in order to solve the problems of the satellite navigation system mentioned above. The GNSS-based RF receiver model has a structure that can simultaneously receive navigation data from the conventional GPS and from the future Galileo system. It is therefore constructed as a multi-band front end that can receive at the same time the L1 band (1575.42 MHz) of GPS and the E1 band (1575.42 MHz), E5b band (1207.1 MHz), and E5a band (1176.45 MHz) of Galileo. The high-precision localization architecture proposes a delay lock loop with Early_early, Early_late, Prompt, Late_early, and Late_late codes, in contrast to the Early, Prompt, and Late codes of a conventional delay lock loop structure. By adopting this delay lock loop structure with 1/4-chip spacing, we address the C/A code synchronization problem caused by inaccuracies in the signal received from the satellite navigation system. This C/A code synchronization problem increases the acquisition delay time of a vehicle navigation system and degrades receiver performance. In addition, the high-sensitivity localization architecture is designed as an asymmetric structure using 20 correlators, which maximizes the reception gain, minimizes noise, and thus improves the reception rate. The satellite navigation system repeatedly transmits the same C/A code 20 times, so we propose a structure that can use all of these repetitions. Since the structure is adaptive and can adjust the number of correlators according to the surrounding environment, it can reduce unnecessary system delay. With this structure, we can lower the acquisition delay time and guarantee the continuity of tracking.
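The delay lock loop described above tracks the C/A code by correlating the incoming signal with locally generated replicas offset around the prompt code; with 1/4-chip spacing, the additional correlator arms tighten the discriminator. The sketch below illustrates only the basic early-minus-late power discriminator on a toy baseband code, with hypothetical code, noise, and delays; it is not the proposed receiver architecture.

```python
# Minimal sketch: early/prompt/late correlators and an early-minus-late power
# discriminator for code tracking, using a toy +/-1 code at 4 samples per chip
# (so a 1/4-chip offset is one sample). Code, noise, and delays are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
SAMPLES_PER_CHIP = 4
code = np.repeat(rng.choice([-1.0, 1.0], size=1023), SAMPLES_PER_CHIP)

true_delay = 3                                  # samples (i.e., 3/4 chip)
received = np.roll(code, true_delay) + 0.3 * rng.standard_normal(code.size)

def correlate(offset_samples):
    replica = np.roll(code, offset_samples)
    return np.dot(received, replica) / code.size

def eml_discriminator(prompt_offset, spacing=1):
    early = correlate(prompt_offset - spacing)
    late = correlate(prompt_offset + spacing)
    return (early**2 - late**2) / (early**2 + late**2)

for guess in range(0, 7):
    print(guess, round(eml_discriminator(guess), 3))
# The discriminator crosses zero near the true delay (3 samples = 3/4 chip);
# this error signal is what a DLL drives toward zero while tracking.
```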

Development of an Offline Based Internal Organ Motion Verification System during Treatment Using Sequential Cine EPID Images (연속촬영 전자조사 문 영상을 이용한 오프라인 기반 치료 중 내부 장기 움직임 확인 시스템의 개발)

  • Ju, Sang-Gyu;Hong, Chae-Seon;Huh, Woong;Kim, Min-Kyu;Han, Young-Yih;Shin, Eun-Hyuk;Shin, Jung-Suk;Kim, Jing-Sung;Park, Hee-Chul;Ahn, Sung-Hwan;Lim, Do-Hoon;Choi, Doo-Ho
    • Progress in Medical Physics / v.23 no.2 / pp.91-98 / 2012
  • Verification of internal organ motion during treatment and its feedback are essential for accurate dose delivery to a moving target. We developed an offline internal organ motion verification system (IMVS) using cine EPID images and evaluated its accuracy and availability through a phantom study. For verification of organ motion using live cine EPID images, a pattern-matching algorithm using an internal surrogate that is clearly distinguishable and represents organ motion in the treatment field, such as the diaphragm, was employed in the in-house analysis software. For the system performance test, we developed a linear motion phantom consisting of a human-body-shaped phantom with a fake tumor in the lung, a linear motion cart, and control software. The phantom was operated with a motion of 2 cm at 4 sec per cycle, and cine EPID images were obtained at rates of 3.3 and 6.6 frames per sec (2 MU/frame) with 1,024 × 768 pixels on a linear accelerator (10 MV X-ray). Organ motion of the target was tracked using the in-house analysis software. Results were compared with the planned data of the motion phantom and with data from a video-image-based tracking system (RPM, Varian, USA) using an external surrogate in order to evaluate accuracy. For quantitative analysis, we analyzed the correlation between the two data sets in terms of the average cycle (peak to peak), amplitude, and pattern (RMS, root mean square) of motion. The average cycles of motion from the IMVS and the RPM system were 3.98 ± 0.11 sec (IMVS, 3.3 fps), 4.005 ± 0.001 sec (IMVS, 6.6 fps), and 3.95 ± 0.02 sec (RPM), respectively, showing good agreement with the actual value (4 sec/cycle). The average amplitude of motion tracked by our system was 1.85 ± 0.02 cm (3.3 fps) and 1.94 ± 0.02 cm (6.6 fps), differing slightly from the actual value (2 cm) by 0.15 cm (7.5% error) and 0.06 cm (3% error), respectively, due to the time resolution of image acquisition. In the analysis of the motion pattern, the RMS value from the cine EPID images at 3.3 fps (0.1044) was slightly larger than that at 6.6 fps (0.0480). The organ motion verification system using sequential cine EPID images with an internal surrogate represented the motion well, within 3% error, in this preliminary phantom study. The system can be implemented for clinical purposes, including verification of organ motion during treatment, comparison with 4D treatment planning data, and feedback for accurate dose delivery to the moving target.
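The IMVS matches an internal surrogate (e.g., the diaphragm edge) in each cine EPID frame and then summarizes the resulting trace by its cycle, amplitude, and RMS. The sketch below illustrates only that trace-summary step on a synthetic sinusoidal trace; the frame rate and motion parameters mirror the phantom setting in the abstract, but the code is not the authors' software.

```python
# Minimal sketch: summarizing a tracked surrogate trace by average cycle length
# (peak to peak), peak-to-peak amplitude, and RMS difference from a reference
# trace. A synthetic 4 s / 2 cm trace sampled at 3.3 fps mimics the phantom.
import numpy as np

FPS = 3.3
t = np.arange(0.0, 60.0, 1.0 / FPS)                 # 60 s of tracking
trace = 1.0 * np.sin(2 * np.pi * t / 4.0)           # cm; 2 cm peak to peak

def summarize(t, x, reference=None):
    peaks = [i for i in range(1, len(x) - 1) if x[i] >= x[i - 1] and x[i] > x[i + 1]]
    cycles = np.diff(t[peaks])                       # peak-to-peak periods (s)
    amplitude = x.max() - x.min()                    # peak-to-peak amplitude (cm)
    rms = None
    if reference is not None:
        rms = float(np.sqrt(np.mean((x - reference) ** 2)))
    return cycles.mean(), cycles.std(), amplitude, rms

reference = 1.0 * np.sin(2 * np.pi * t / 4.0)        # e.g., planned phantom motion
print(summarize(t, trace, reference))                # ~ (4.0 s, small std, ~2.0 cm, ~0)
```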

Information Privacy Concern in Context-Aware Personalized Services: Results of a Delphi Study

  • Lee, Yon-Nim;Kwon, Oh-Byung
    • Asia pacific journal of information systems / v.20 no.2 / pp.63-86 / 2010
  • Personalized services directly and indirectly acquire personal data, in part, to provide customers with higher-value services that are specifically context-relevant (such as place and time). Information technologies continue to mature and develop, providing greatly improved performance. Sensor networks and intelligent software can now obtain context data, and that is the cornerstone for providing personalized, context-specific services. Yet, the danger of personal information being exposed is increasing because the data retrieved by the sensors usually contain privacy-related information. Various technical characteristics of context-aware applications have troubling implications for information privacy. In parallel with the increasing use of context for service personalization, information privacy concerns have also increased, such as concern over the unrestricted availability of context information. Those privacy concerns are consistently regarded as a critical issue facing the success of context-aware personalized services. The entire field of information privacy is growing as an important area of research, with many new definitions and terminologies, because of the need for a better understanding of information privacy concepts. In particular, the factors of information privacy should be revised according to the characteristics of new technologies. However, previous information privacy factors for context-aware applications have at least two shortcomings. First, there has been little overview of the technological characteristics of context-aware computing; existing studies have focused on only a small subset of them. Therefore, there has not been a mutually exclusive set of factors that uniquely and completely describes information privacy in context-aware applications. Second, user surveys have been widely used to identify factors of information privacy in most studies, despite the limitations of users' knowledge of and experience with context-aware computing technology. To date, since context-aware services have not yet been widely deployed on a commercial scale, only very few people have prior experience with context-aware personalized services. It is difficult to build users' knowledge about context-aware technology even by increasing their understanding in various ways: scenarios, pictures, flash animations, etc. Nevertheless, conducting a survey on the assumption that the participants have sufficient experience with or understanding of the technologies shown in the survey may not be entirely valid. Moreover, some surveys are based solely on simplifying and hence unrealistic assumptions (e.g., they consider only location information as context data). A better understanding of information privacy concern in context-aware personalized services is therefore needed. Hence, the purpose of this paper is to identify a generic set of factors for elemental information privacy concern in context-aware personalized services and to develop a rank-ordered list of information privacy concern factors. We consider overall technology characteristics to establish a mutually exclusive set of factors. A Delphi survey, a rigorous data collection method, was deployed to obtain reliable opinions from experts and to produce a rank-ordered list. It therefore lends itself well to obtaining a set of universal factors of information privacy concern and their priority.
An international panel of researchers and practitioners with expertise in privacy and context-aware systems was involved in our research. The formatting of the Delphi rounds faithfully follows the procedure for the Delphi study proposed by Okoli and Pawlowski. This involves three general rounds: (1) brainstorming for important factors; (2) narrowing down the original list to the most important ones; and (3) ranking the list of important factors. For this round only, experts were treated as individuals, not panels. Adapting from Okoli and Pawlowski, we outlined the process of administering the study and performed three rounds. In the first and second rounds of the Delphi questionnaire, we gathered a set of exclusive factors for information privacy concern in context-aware personalized services. In the first round, the respondents were asked to provide at least five main factors for the most appropriate understanding of information privacy concern; to support this, some of the main factors found in the literature were presented to the participants. The second round of the questionnaire discussed the main factors provided in the first round, fleshed out with relevant sub-factors drawn from the literature survey. Respondents were then requested to evaluate each sub-factor's suitability against the corresponding main factors in order to determine the final sub-factors from the candidates; the final factors were those selected by over 50% of the experts. In the third round, a list of factors with corresponding questions was provided, and the respondents were requested to assess the importance of each main factor and its corresponding sub-factors. Finally, we calculated the mean rank of each item to obtain the final result. While analyzing the data, we focused on group consensus rather than individual insistence. To do so, a concordance analysis, which measures the consistency of the experts' responses over successive rounds of the Delphi, was adopted during the survey process (see the sketch after this abstract). As a result, the experts reported that context data collection and a highly identifiable level of identical data are the most important main factor and sub-factor, respectively. Additional important sub-factors included the diverse types of context data collected, tracking and recording functionalities, and embedded and disappearing sensor devices. The average score of each factor is very useful for future context-aware personalized service development from the viewpoint of information privacy. The final factors differ from those proposed in other studies in the following ways. First, the concern factors differ from existing studies, which are based on privacy issues that may occur during the lifecycle of acquired user information. Our study helped to clarify these sometimes vague issues by determining which privacy concern issues are viable based on the specific technical characteristics of context-aware personalized services. Since a context-aware service differs in its technical characteristics from other services, we selected specific characteristics that had a higher potential to increase users' privacy concerns. Secondly, this study considered privacy issues in terms of service delivery and display, which were almost overlooked in existing studies, by introducing IPOS as the factor division. Lastly, for each factor, it correlated the level of importance with professionals' opinions as to what extent users have privacy concerns.
The reason a traditional questionnaire survey was not selected is the users' near-total lack of understanding of and experience with context-aware personalized services as a new technology. Regarding users' privacy concerns, the professionals in the Delphi process selected context data collection, tracking and recording, and the sensor network as the most important factors among the technological characteristics of context-aware personalized services. For the creation of context-aware personalized services, this study demonstrates the importance and relevance of determining an optimal methodology, and which technologies, in what sequence, are needed to acquire which types of users' context information. Most studies focus on which services and systems should be provided and developed by utilizing context information, along with the development of context-aware technology. However, the results of this study show that, in terms of users' privacy, it is necessary to pay greater attention to the activities that acquire context information. To build on the evaluation of the sub-factors, additional studies would be necessary on approaches to reducing users' privacy concerns toward technological characteristics such as the highly identifiable level of identical data, the diverse types of context data collected, tracking and recording functionality, and embedded and disappearing sensor devices. The factor ranked at the next-highest level of importance after input is context-aware service delivery, which is related to output. The results show that delivery and display, which present services to users in context-aware personalized services under the anywhere-anytime-any-device concept, are regarded as even more important than in previous computing environments. Considering these concern factors when developing context-aware personalized services will help to increase the service success rate and, hopefully, user acceptance. Our future work will be to adopt these factors for qualifying context-aware service development projects, such as u-city development projects, in terms of service quality and hence user acceptance.
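The concordance analysis mentioned in this abstract is commonly carried out with Kendall's coefficient of concordance (W), which measures how consistently a panel ranks the factors across Delphi rounds. The sketch below computes W for a small hypothetical ranking matrix (no tie correction); it is a generic illustration and not the study's data or code.

```python
# Minimal sketch: Kendall's coefficient of concordance (W) for expert rankings,
# a common choice for the Delphi concordance check. Rows = experts, columns =
# factors, values = ranks (1 = most important). The ranking matrix is hypothetical.
import numpy as np

ranks = np.array([
    [1, 2, 3, 4, 5],
    [2, 1, 3, 5, 4],
    [1, 3, 2, 4, 5],
    [1, 2, 4, 3, 5],
])  # 4 experts ranking 5 factors

def kendalls_w(ranks):
    m, n = ranks.shape                          # m raters, n items
    column_sums = ranks.sum(axis=0)
    s = ((column_sums - column_sums.mean()) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))   # W = 12S / (m^2 (n^3 - n))

print(f"Kendall's W = {kendalls_w(ranks):.3f}")  # 1.0 would mean perfect agreement
```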

Tool for Supporting Design Pattern-Oriented Software Development (디자인 패턴지향 소프트웨어 개발 지원 도구)

  • Kim, Woon-Yong;Choi, Young-Keun
    • Journal of KIISE:Software and Applications / v.29 no.8 / pp.555-564 / 2002
  • Design patterns are used to exploit well-defined design information. By using design patterns, we can achieve reuse in the object-oriented paradigm, decrease development time, and improve software quality. Although design patterns are widely used in practice, most design pattern information is applied manually and inconsistently, so its utilization can be very low. Because the design pattern information that a designer applies does not appear in the software, it is sometimes difficult to track the patterns. In this paper, we propose a support tool for design pattern-oriented software development. This tool supports design pattern management, software design, and automatic source code generation. The design pattern management component provides functions for storing, managing, and analyzing existing design patterns and for registering new ones. The software design component supports software design with UML and automatically generates design pattern elements. Using this design information, the system can automatically generate source code. By including the tracking of design pattern elements, which existing CASE tools do not provide, in the design information, we can build a stable and efficient system for analyzing software, managing design patterns, and automatically generating source code.
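The claim that source code can be generated automatically from stored pattern information is easiest to see with a small template example. The sketch below renders a class skeleton from a pattern participant description; the pattern dictionary and the Java-like output are hypothetical and do not represent the tool described in the paper.

```python
# Minimal sketch: generating a source-code skeleton from stored design pattern
# information (here, an Observer pattern participant list). The description
# format and the emitted Java-like skeleton are hypothetical.
OBSERVER_PATTERN = {
    "name": "Observer",
    "participants": [
        {"role": "Subject",  "methods": ["attach(Observer o)", "detach(Observer o)", "notifyObservers()"]},
        {"role": "Observer", "methods": ["update()"]},
    ],
}

def generate_skeleton(pattern, language_comment="//"):
    lines = [f"{language_comment} Generated from design pattern: {pattern['name']}"]
    for p in pattern["participants"]:
        lines.append(f"public abstract class {p['role']} {{")
        for method in p["methods"]:
            lines.append(f"    public abstract void {method};")
        lines.append("}")
        lines.append("")
    return "\n".join(lines)

print(generate_skeleton(OBSERVER_PATTERN))
```

Recording which generated elements came from which pattern is what makes the pattern traceable later, which is the gap in existing CASE tools that the abstract points out.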