Preoperative Assessment of Renal Sinus Invasion by Renal Cell Carcinoma according to Tumor Complexity and Imaging Features in Patients Undergoing Radical Nephrectomy
Korean Journal of Radiology, v.22 no.8, pp.1323-1331, 2021
Objective: To identify the association between renal tumor complexity and pathologic renal sinus invasion (RSI) and evaluate the usefulness of computed tomography tumor features for predicting RSI in patients with renal cell carcinoma (RCC). Materials and Methods: This retrospective study included 276 consecutive patients who underwent radical nephrectomy for RCC with a size of ≤ 7 cm between January 2014 and October 2017. Tumor complexity and anatomical renal sinus involvement were evaluated using two standardized scoring systems: the radius (R), exophytic or endophytic (E), nearness to collecting system or sinus (N), anterior or posterior (A), and location relative to polar lines (RENAL) nephrometry and preoperative aspects and dimensions used for anatomical classification (PADUA) system. CT-based tumor features, including shape, enhancement pattern, margin at the interface of the renal sinus (smooth vs. non-smooth), and finger-like projection of the mass, were also assessed by two independent radiologists. Univariable and multivariable logistic regression analyses were performed to identify significant predictors of RSI. The positive predictive value, negative predictive value (NPV), accuracy of anatomical renal sinus involvement, and tumor features were evaluated. Results: Eighty-one of 276 patients (29.3%) demonstrated RSI. Among highly complex tumors (RENAL or PADUA score ≥ 10), the frequencies of RSI were 42.4% (39/92) and 38.0% (71/187) using RENAL and PADUA scores, respectively. Multivariable analysis showed that a non-smooth margin and the presence of a finger-like projection were significant predictors of RSI. Anatomical renal sinus involvement showed high NPVs (91.7% and 95.2%) but low accuracy (40.2% and 43.1%) for RSI, whereas the presence of a non-smooth margin or finger-like projection demonstrated comparably high NPVs (90.0% and 91.3% for both readers) and improved accuracy (67.0% and 73.9%, respectively). 
Conclusion: A non-smooth margin or the presence of a finger-like projection can be used as a preoperative CT-based tumor feature for predicting RSI in patients with RCC.
Purpose: As part of ongoing efforts to improve the current e-learning center, a survey on user experience and satisfaction was conducted to identify areas of improvement. Materials and Methods: Radiologists (n = 454/617) and radiology residents (n = 163/617) of the Korean Society of Radiology were asked to answer a survey via email. The questionnaire asked for basic user information as well as experiences with the e-learning center, such as workplace, frequency of use, overall satisfaction, reasons for satisfaction or dissatisfaction, and other suggestions for improvement. Results: Annual members and all members of the e-learning center reported above-average satisfaction levels of 67% and 42%, respectively. Approximately 30% of respondents viewed e-learning center lectures more than 5 times a month, with residents having a particularly high usage frequency. There was a high demand for additional lectures covering more diverse specialties (e-learning for annual members only: n = 28/97; e-learning for all members: n = 72/166), a smoother and more convenient search platform/interface (n = 37/97 and n = 58/166, respectively), and regular content updates. In addition, many members suggested user-friendly functions such as playback speed control and a way to save viewing history, and requested improved system stability. Conclusion: Based on the survey results, the educational committee plans to continue improving the e-learning center by increasing the quality and quantity of available lectures and strengthening technical support to improve the stability and convenience of the e-learning system.
This study aims to train and implement a deep learning model that fuses website creation with artificial intelligence, in the era known as the AI revolution following the launch of the ChatGPT service. The model was trained using 3,000 collected web page images, processed according to a classification system of components and layouts. The process was divided into three stages. First, prior research on AI models was reviewed to select the most appropriate algorithm for the model we intended to implement. Second, suitable web page and paragraph images were collected, categorized, and processed. Third, the deep learning model was trained, and a serving interface was integrated to verify the model's actual outputs. The implemented model will be used to detect multiple paragraphs on a web page, analyzing the number of lines, elements, and features in each paragraph and deriving meaningful data based on the classification system. This process is expected to evolve, enabling more precise analysis of web pages. Furthermore, the development of precise analysis techniques is anticipated to lay the groundwork for research into AI's capability to automatically generate complete web pages.
Lithium secondary batteries are currently used as medium- and large-sized energy sources for electric vehicles and energy storage systems (ESS) due to their high energy density and eco-friendly characteristics. However, commercialized lithium secondary batteries do not yet fully meet the demands for high energy density and safety, and many studies on solid electrolytes are being conducted to satisfy these requirements. To commercialize a solid electrolyte, its low ionic conductivity and high interfacial resistance with the electrode, compared with organic liquid electrolytes, must be addressed. Therefore, in this study, oligo(3,4-ethylenedioxythiophene) (EDOT) is added to poly(vinyl alcohol) (PVA), a polymer matrix with ionic conductivity and adhesive characteristics, to decrease the interfacial resistance with a polythiophene (PTh)-based electrode of the same type. In addition, the addition of a porous silicon dioxide (SiO2) filler improves lithium salt dissociation and increases ionic conductivity. Finally, the electrochemical stability of the solid electrolyte, which is lowered by these additives, is improved by introducing a cross-linked structure using boric acid (BA).
Recent advances in large-scale data processing technologies such as big data, cloud computing, and artificial intelligence have increased the demand for high-performance storage devices in data centers and enterprise environments. In particular, the fast data response speed of storage devices is a key factor that determines overall system performance. Solid state drives (SSDs) based on the Non-Volatile Memory Express (NVMe) interface are gaining traction, but new bottlenecks are emerging when large data input and output requests from multiple hosts are handled simultaneously. SSDs typically process host requests by stacking them sequentially in an internal queue. When requests with long transfer lengths are processed first, shorter requests wait longer, increasing the average response time. Data transfer timeout and data partitioning methods have been proposed to solve this problem, but they do not provide a fundamental solution. In this paper, we propose a dual queue based scheduling scheme (DQBS), which manages the data transfer order based on the request order in one queue and the transfer length in the other. The request arrival order and transfer length are then considered together to determine an efficient data transfer order. This enables balanced processing of long and short requests, reducing the overall average response time. Simulation results show that the proposed method outperforms the existing sequential processing method. This study presents a scheduling technique that maximizes data transfer efficiency in a high-performance SSD environment and is expected to contribute to the development of next-generation high-performance storage systems.
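The dual-queue idea described above can be sketched as follows. This is a minimal illustrative simulation, not the authors' implementation: the selection rule (serve the shortest pending request unless the oldest request has been bypassed past an aging threshold) and the `aging` parameter are assumptions added to make the trade-off between arrival order and transfer length concrete.

```python
import heapq
from collections import deque

# Sketch of a dual-queue scheduler in the spirit of DQBS: requests are tracked
# both in arrival order (FIFO queue) and by transfer length (min-heap). The
# aging threshold below is an illustrative assumption, not the paper's policy.

def schedule(requests, aging=3):
    """requests: list of (request_id, transfer_length) in arrival order.
    Returns the service order as a list of request ids."""
    fifo = deque(requests)                       # arrival-order queue
    by_len = [(length, rid) for rid, length in requests]
    heapq.heapify(by_len)                        # transfer-length queue
    served, order, bypassed = set(), [], 0
    while fifo:
        # Discard entries already served via the other queue.
        while fifo and fifo[0][0] in served:
            fifo.popleft()
        if not fifo:
            break
        if bypassed >= aging:
            # Oldest request has waited too long: serve it to avoid starvation.
            rid, _ = fifo.popleft()
            bypassed = 0
        else:
            # Otherwise serve the shortest pending request first.
            while by_len and by_len[0][1] in served:
                heapq.heappop(by_len)
            _, rid = heapq.heappop(by_len)
            bypassed += 1
        served.add(rid)
        order.append(rid)
    return order

# Short requests B, C, D jump ahead of the long request A, but A is
# eventually served by the aging rule rather than starving indefinitely.
print(schedule([("A", 8), ("B", 2), ("C", 4), ("D", 1)]))
```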
Recently, the rapid progress of standardized web technologies and the worldwide proliferation of web users have brought an explosive increase in the production and consumption of information documents on the web. In addition, most companies produce, share, and manage a huge number of information documents needed to perform their business, and they also collect, store, and manage many web documents published on the web. Along with this increase in documents that must be managed, the need for a solution that locates information documents more accurately among a huge number of sources has grown, and the market for search engine solutions is expanding accordingly. The most important functionality a search engine provides is locating accurate information documents within huge information sources. The major metric for evaluating search accuracy is relevance, which consists of two measures: precision and recall. Precision is a measure of exactness, that is, what percentage of the retrieved results are actually true answers, whereas recall is a measure of completeness, that is, what percentage of the true answers are retrieved. These two measures are weighted differently depending on the domain: when information must be searched exhaustively, as with patent documents and research papers, it is better to increase recall, whereas when the amount of information is small, it is better to increase precision. Most existing web search engines use a keyword search method that returns web documents containing the keywords entered by a user. This method has the virtue of locating all matching web documents quickly, even when many search words are entered.
However, it has a fundamental limitation: it does not consider the user's search intention, and therefore retrieves irrelevant results along with relevant ones. Additional time and effort are then needed to sort the relevant results out of everything the search engine returns. That is, keyword search can increase recall, but it is difficult to locate the web documents a user actually wants because the method provides no means of understanding the user's intention and reflecting it in the search process. This research therefore suggests a new method that combines an ontology-based search solution with the core search functionality of existing search engine solutions. The method enables a search engine to provide optimal results by inferring the user's search intention. To that end, we build an ontology containing the concepts in a specific domain and the relationships among them. The ontology is used to infer synonyms of the search keywords entered by a user, so that the user's intention is reflected in the search process more actively than in existing search engines. Based on the proposed method, we implement a prototype search system and test it in the patent domain, where we experiment on searching for documents relevant to a patent. The experiment shows that our system increases both recall and precision and improves search productivity through an improved user interface that enables users to interact with the system effectively. In future research, we will validate the performance of our prototype by comparing it with other search engine solutions, and we will extend the approach to other information search domains such as portals.
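The two relevance measures defined above can be computed directly from the retrieved and true-answer document sets. The following sketch uses hypothetical document IDs for illustration.

```python
# Precision = relevant retrieved / all retrieved (exactness).
# Recall    = relevant retrieved / all true answers (completeness).

def precision_recall(retrieved, relevant):
    """Compute precision and recall for one query.

    retrieved: set of document IDs returned by the search engine
    relevant:  set of document IDs that are true answers
    """
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Example: the engine returns 4 documents, 3 of which are among 6 true answers.
p, r = precision_recall({"d1", "d2", "d3", "d4"},
                        {"d1", "d2", "d3", "d5", "d6", "d7"})
print(p, r)  # 0.75 0.5
```

A patent search system would tune toward the recall term, accepting more false positives to avoid missing prior art.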
As information and communication technology has developed, blood pressure, pulse, electrocardiogram (ECG), SpO2, and blood tests can now be checked easily at home. Routine health monitoring becomes possible by linking home medical instruments to the wireless public data network. Such a service removes the inconvenience of visiting the hospital each time and saves the individual's time and cost. In each house, biosignal data detected from the human body are transmitted to a distant hospital through the wireless public data network. The medical information transmission system uses a wireless short-range network: the acquired biosignals are transmitted wirelessly from the personal device to the main center system in the hospital. The remote telemetry system is implemented using a wireless medium access protocol adapted from the CSMA/CA (Carrier Sense Multiple Access with Collision Avoidance) protocol standardized in IEEE 802.11. Among home-care telemetry systems that can measure blood pressure, pulse, ECG, and SpO2, this study implements the ECG measurement part. The ECG function is built into a mobile device, and a 900 MHz band wireless public data interface is added, so that the elderly, patients, or anyone at home can acquire, store, and record ECG data. This is essential for managing patients found in a health examination to have heart disease, or more complicated heart conditions, and for continuously observing latent heart disease patients. To implement the medical information transmission system based on the wireless network, the ECG data among the biosignals are transmitted using a wireless network modem and the NCL (Native Control Language) protocol, and are connected to the wired host computer through the SCR (Standard Context Routing) protocol in the network.
The computer checks the recorded individual information and the acquired ECG data, then sends the corresponding examination result back to the mobile device. This study suggests a medical transmission system model based on the wireless public data network.
Customer satisfaction with WAP services relies greatly on usability, due to the limited display size of a mobile phone and the limitations in realizing the UI (user interface) for function keys, browsers, and operating systems (OS). Currently, many content providers develop and deliver varying services, so it is critical to control the UI quality level with consistent standards. This study suggests a usability index evaluation system to achieve consistent UI quality control across various WAP services. The system adopts both top-down and bottom-up approaches. The former derives UI design components and evaluation checklists for the WAP from usability attributes and UI principles. The latter derives usability-related evaluation checklists from the established UI design features and then groups them from the viewpoint of usability principles and attributes. This bidirectional approach has two outstanding advantages: it allows thorough examination of potential elements that can cause usability problems from the standpoint of usability attributes, and it derives specific evaluation elements from the perspective of UI design components relevant to the real service environment. The evaluation system forms a hierarchical structure by networking usability attributes, UI guidelines that state the usability principles for each attribute, and a usability evaluation checklist for each UI component that enables specific evaluation. In particular, each evaluation checklist has concrete content and a concrete format so that it can readily be marked O/X. The score is the ratio of the number of items receiving a positive answer to the total number of items, which enables a quantitative evaluation of the usability of a mobile WAP service. The validity of the proposed evaluation system has been demonstrated through comparative analysis against real usability problems identified in user tests.
Software was also developed that provides guidelines on evaluation objects, criteria, and examples for each checklist and automatically calculates the score. The software was applied to evaluating and improving a real mobile WAP service.
The wall shear stress in the vicinity of end-to-end anastomoses under steady flow conditions was measured using a flush-mounted hot-film anemometer (FMHFA) probe. The experimental measurements were in good agreement with numerical results except in flows with low Reynolds numbers. The wall shear stress increased proximal to the anastomosis in flow from the Penrose tubing (simulating an artery) to the PTFE graft. In flow from the PTFE graft to the Penrose tubing, low wall shear stress was observed distal to the anastomosis. Abnormal distributions of wall shear stress in the vicinity of the anastomosis, resulting from the compliance mismatch between the graft and the host artery, might be an important factor in anastomotic neointimal fibrous hyperplasia (ANFH) formation and graft failure. The present study suggests a correlation between regions of low wall shear stress and the development of ANFH in end-to-end anastomoses.
Air pressure decay (APD) rate and ultrafiltration rate (UFR) tests were performed on new and saline-rinsed dialyzers as well as on dialyzers reused in patients several times. C-DAK 4000 (Cordis Dow) and CF IS-11 (Baxter Travenol) reused dialyzers obtained from the dialysis clinic were used in the present study. The new dialyzers exhibited a relatively flat APD, whereas saline-rinsed and reused dialyzers showed a considerable amount of decay. C-DAK dialyzers had a larger APD (11.70