• Title/Summary/Keyword: Tool performance

Numerical and Experimental Study on the Coal Reaction in an Entrained Flow Gasifier (습식분류층 석탄가스화기 수치해석 및 실험적 연구)

  • Kim, Hey-Suk;Choi, Seung-Hee;Hwang, Min-Jung;Song, Woo-Young;Shin, Mi-Soo;Jang, Dong-Soon;Yun, Sang-June;Choi, Young-Chan;Lee, Gae-Goo
    • Journal of Korean Society of Environmental Engineers, v.32 no.2, pp.165-174, 2010
  • This study presents numerical modeling of the coal gasification reactions occurring in an entrained flow coal gasifier. Its purpose is to develop a reliable CFD (Computational Fluid Dynamics) evaluation method for the gasifier, covering both basic design and subsequent optimization of system operation. The gasification process comprises a series of reactions, including water evaporation, coal devolatilization, heterogeneous char reactions, and gas-phase reactions of the coal off-gas, in a two-phase, turbulent, radiatively participating medium. Both numerical and experimental studies were carried out on the 1.0 ton/day entrained flow coal gasifier installed at the Korea Institute of Energy Research (KIER). The computer program was built on a commercial CFD code by implementing several subroutines required for the gasification process, including an Eddy-Breakup model combined with a harmonic mean approach for turbulent reaction. A Lagrangian approach was adopted for particle trajectories, accounting for turbulence effects arising from the non-linearity of the drag force. The program was evaluated against experimental data, including temperature and gaseous species concentration profiles and the cold gas efficiency, with good agreement. Further parametric investigation covered the size distribution of the pulverized coal particles, the slurry concentration, and the design parameters of the gasifier; these parameters were compared through the calculated syngas production rate and cold gas efficiency and were found to directly affect gasification performance. Given the complexity of entrained flow coal gasification, and even though the parametric results appear physically reasonable and consistent, more elaborate modeling and systematic evaluation against experimental data are needed before the CFD method can serve as a reliable design tool.
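  A minimal, self-contained sketch (not the authors' code, with assumed gas and particle properties) of the Lagrangian particle-tracking idea mentioned above: the coal particle velocity is integrated under a drag force whose coefficient depends non-linearly on the particle Reynolds number, here via the widely used Schiller-Naumann correlation.

    # Sketch of Lagrangian particle tracking with a non-linear drag law
    # (Schiller-Naumann correlation). Property values are assumptions for
    # illustration only, not taken from the paper.
    rho_g, mu_g = 0.3, 4.0e-5      # gas density [kg/m3], viscosity [Pa.s] (assumed)
    rho_p, d_p = 1300.0, 80.0e-6   # particle density [kg/m3], diameter [m] (assumed)

    def drag_coefficient(re_p):
        """Schiller-Naumann drag law, valid for particle Reynolds numbers below ~1000."""
        return 24.0 / re_p * (1.0 + 0.15 * re_p ** 0.687)

    def step(u_p, u_g, dt):
        """Advance the particle velocity u_p one time step in gas moving at u_g."""
        u_rel = u_g - u_p
        re_p = max(rho_g * abs(u_rel) * d_p / mu_g, 1.0e-12)
        cd = drag_coefficient(re_p)
        a_drag = 0.75 * cd * rho_g * abs(u_rel) * u_rel / (rho_p * d_p)  # drag acceleration
        return u_p + a_drag * dt

    u_p, u_g = 0.0, 10.0           # particle initially at rest in a 10 m/s gas stream
    for _ in range(2000):          # integrate 0.2 s with dt = 1e-4 s
        u_p = step(u_p, u_g, dt=1.0e-4)
    print(f"particle velocity after 0.2 s: {u_p:.2f} m/s")

  In a full CFD implementation this integration would be coupled to the gas-phase solution and a stochastic turbulent dispersion model; the sketch only shows the drag non-linearity itself.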

A Numerical and Experimental Study for Fry-drying of Various Sludge (슬러지 유중 건조에 대한 전산 해석 및 실험적 연구)

  • Shin, Mi-Soo;Kim, Hey-Suk;Kim, Byeong-Gap;Hwang, Min-Jeong;Jang, Dong-Soon;Ohm, Tae-In
    • Journal of Korean Society of Environmental Engineers, v.32 no.4, pp.341-348, 2010
  • The basic principle of the fry-drying of sludge lies in the rapid pressure change inside the sludge caused by the temperature difference between the oil and the moisture, which stems from their difference in specific heat; the rapid pressure rise in the drying sludge drives moisture efficiently out through the sludge pores into the heating oil. The objective of this study is a systematic investigation of how the parameters of the fry-drying process influence drying efficiency. To this end, a series of parametric experiments was carried out together with numerical calculations to obtain typical drying curves as functions of important parameters such as drying temperature, sludge diameter, oil type, and sludge type. Regarding frying temperature, operation above $140^{\circ}C$ was found favorable for drying efficiency regardless of the type of waste oil employed. The numerical calculations consistently showed the same result: sludge particles dried efficiently above $140^{\circ}C$ irrespective of particle diameter. As expected, decreasing the sludge diameter improved drying in both the experiments and the calculations because of the increased surface area per unit volume. Regarding oil type and properties, the viscosity of the waste oil was the most influential property for drying performance; with a high-viscosity oil, a visible delay in moisture evaporation was noticed, especially in the early stage of drying, but this effect diminished significantly above $140^{\circ}C$. No clear difference was observed between sludge types, although the sewage sludge showed slightly better efficiency. The numerical model is considered a useful tool to complement the experiments, and more detailed empirical modeling is planned as further work.
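  Purely for illustration of how a drying curve can be generated numerically as a function of oil temperature and particle diameter, the sketch below uses an assumed first-order, lumped drying model with an Arrhenius-like temperature sensitivity; it is not the authors' model and the constants are placeholders, but it reproduces the qualitative trends reported above (faster drying above about 140 C and for smaller diameters).

    # Illustrative lumped drying-curve model (assumed first-order kinetics;
    # constants are hypothetical, not fitted to the paper's data).
    import math

    def drying_curve(oil_temp_c, diameter_m, minutes=30, dt_s=1.0):
        """Return (time [min], moisture fraction) samples for a lumped sludge sphere."""
        k0, t_sens = 5.0e-4, 2500.0            # assumed rate prefactor and sensitivity [K]
        area_per_vol = 6.0 / diameter_m        # sphere: surface area per unit volume
        k = k0 * area_per_vol * math.exp(-t_sens / (oil_temp_c + 273.15))
        moisture, samples = 0.80, []           # assumed initial moisture fraction
        for step in range(int(minutes * 60 / dt_s) + 1):
            if step % 60 == 0:                 # record once per minute
                samples.append((step * dt_s / 60.0, moisture))
            moisture -= k * moisture * dt_s    # first-order moisture loss
        return samples

    for temp_c in (120, 140, 160):
        residual = drying_curve(temp_c, diameter_m=0.01)[-1][1]
        print(f"oil at {temp_c} C -> residual moisture after 30 min: {residual:.2f}")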

The Records and Archives Administrative Reform in China in 1930s (1930년대 중국 문서당안 행정개혁론의 이해)

  • Lee, Won-Kyu
    • The Korean Journal of Archival Studies, no.10, pp.276-322, 2004
  • Historical interest in 1930s China has mostly focused on the political character of the National Government (國民政府) established by the KMT (中國國民黨) as a result of national unification. This unified revolutionary government certainly gave China a chance to construct a modern state, but it was also a period of deepening national crisis, from the Manchurian Incident to the Sino-Japanese War, on top of domestic turmoil, so there is good reason to examine how the political powers of the day responded. As recent studies have shown, the way the revolutionary regime exercised political power comes into view through an understanding of its internal operating system. This article starts from the fact that the Nationalist Government carried out an administrative reform aimed at "administrative efficiency" in the mid-1930s, but it emphasizes the seriousness of the problems addressed and their solutions rather than the political background or outcomes. The "Committee on Administrative Efficiency (行政效率委員會)", established in 1934 as the center of the reform movement, examined plans to implement the reform through legislation by the Executive Council (行政院) on the basis of the relevant studies. Its members argued that the construction of a modern state should no longer proceed by political revolution but by gradual improvement and daily reform, and that the operation of government should become modern, scientific, and efficient. Among the many fields targeted for reform, records and archives administration (文書檔案行政) was studied intensively from the initial stage because it had already been the subject of sustained discussion. Records and archives were recognized as the basic tools of work performance and general administrative activity, yet the field remained inefficient despite the large number of staff assigned to it; above all, archival reform provoked fewer conflicts than reform of finance, organization, or personnel. In records administration, the key subjects, that records should be written simply, that their processing should be transparent, and that delays should be prevented, had already been presented at a records administration meeting in 1922. No unified law on record management had been established, however, so each government organ followed conventional custom or made its own improvements. A new trend toward unified, system-wide improvement appeared at another records administration workshop of the Nationalist Government in 1933, which decided to unify the format of official records, to use markers and sections, to unify the registration of incoming and outgoing records, and to strengthen the examination of records processing. Yet the method of records processing itself was still not unified, so the key point of the records reform became the establishment of a unified, standard record management system that would prevent duplication by simplifying processing procedures and concentrate processing in dedicated organizations. From the founding of the Republic of China to the 1930s there was little change in archives administration, and archives management methods were prescribed differently even within the same department or section.
Accordingly, the issues were to centralize the scattered management systems operated by individual sections, to establish unified standards for filing and retention periods, and to improve retrieval through classification and proper numbering. A particular problem was that the dual operation of record registration and archives registration produced divergent numbering and classification results, making strict management through mutual checking, retrieval, and use impossible. Various other problems were also raised, including filing tools, arrangement methods, preservation facilities and equipment, lending services, and methods of use. In the course of studying these improvements, the reformers came to recognize that records and archives are essentially the same thing, and they devised a continuous management method called the "Records and Archives Chain Management Method (文書檔案連鎖法)" as a potential alternative. Its principles were that records and archives in each organization should be managed in a unified way by a general records receipt section and a general archives section under the principle of centralization; that a consistent classification scheme, decided in advance according to organizational structure and work functions, should be used; and that an identical numbering system should be applied at both the record management stage and the archives management stage by means of a card-type register. Although the Records and Archives Chain Management Method reached the stage of trial application in several organizations, it was never adopted as a regular system and was discontinued, because the administrative reform of the Nationalist Government was cut short by the outbreak of the Sino-Japanese War. Even though the mid-1930s reform produced an experiment rather than practical results, it demonstrated that the Nationalist Government's reform of tradition and custom, aimed at constructing a modern state, was not confined to the field of politics, while the weak foundations of government operation remained an obstacle to the realization of the revolutionary regime's political power. Although the reform of records and archives administration was postponed to the future, the consciousness of modern records and archives administration, and comprehensive studies of it, began with this attempt at administrative reform.

A Study on the Strategy of IoT Industry Development in the 4th Industrial Revolution: Focusing on the direction of business model innovation (4차 산업혁명 시대의 사물인터넷 산업 발전전략에 관한 연구: 기업측면의 비즈니스 모델혁신 방향을 중심으로)

  • Joeng, Min Eui;Yu, Song-Jin
    • Journal of Intelligence and Information Systems, v.25 no.2, pp.57-75, 2019
  • This paper examines the direction of business model innovation in the Internet of Things (IoT) industry, the most actively industrialized of the core technologies of the 4th Industrial Revolution. Policy, economic, social, and technological issues were derived using a PEST analysis of global trends, and the outlooks of ICT-related global research institutes such as Gartner and International Data Corporation for the IoT industry were presented. These institutes predict that competition in network technologies will be a key issue for the Industrial IoT (IIoT) and the IoT built on infrastructure and platforms. The PEST analysis showed that developed countries are pursuing policies for the 4th Industrial Revolution through government-led cooperation with the private sector (businesses and research institutes), and that South Korea is likewise expanding related R&D budgets and establishing related policies. On the economic side, the growth rate of the related industries (based on aggregate market value) and the performance of individual firms were reviewed: industries related to the 4th Industrial Revolution in advanced countries were found to grow faster than other industries, whereas in Korea the growth of the "technology hardware and equipment" and "communication services" sectors was relatively low. On the social side, enormous ripple effects are expected across society, largely from changes in technology and industrial structure, in employment structure, and in the volume of jobs. On the technological side, changes are under way in every industry, as represented by the health care and manufacturing sectors, which are changing rapidly as they converge with 4th Industrial Revolution technologies. To cope with this rapidly changing industrial environment, various management methodologies for innovating existing business models were reviewed, and four criteria were established for selecting a model suited to the new business environment: Applicability, Agility, Diversity, and Connectivity. An expert survey analyzed with AHP showed that the Business Model Canvas is best suited as a business model innovation methodology, with very high importance scores of 42.5% for Applicability, 48.1% for Agility, 47.6% for Diversity, and 42.9% for Connectivity; it was therefore selected as a model that can be applied flexibly across industrial ecosystems and paradigm shifts. The Business Model Canvas is a relatively recent management methodology that identifies the value of a business model through a nine-block approach covering the four key areas of a business: customers, offering, infrastructure, and financial viability. The paper presents directions for expanding and applying the nine blocks from the perspective of an IoT (ICT) company, and the conclusion discusses which Business Model Canvas configurations can be applied in the ICT convergence industry.
Based on the nine blocks, and with appropriate tailoring to the characteristics of the target company, various applications are possible, such as integrating or removing blocks (down to five or seven blocks) and subdividing blocks to fit the company's characteristics. Future research needs to develop customized business innovation methodologies for IoT companies, or for companies providing internet-based services. In addition, since this study derived the Business Model Canvas from expert opinion as a useful tool for innovation, studies on its usability, presenting detailed implementation strategies such as application cases and application models for actual companies, are needed to extend and demonstrate the research.
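  A hedged sketch of the AHP step described above: deriving criteria weights from a pairwise comparison matrix via the principal eigenvector. The judgment values in the matrix are hypothetical placeholders; the paper's actual expert-survey judgments are not reproduced here.

    # AHP priority weights from a reciprocal pairwise comparison matrix.
    import numpy as np

    criteria = ["Applicability", "Agility", "Diversity", "Connectivity"]
    # A[i, j] = relative importance of criterion i over criterion j (Saaty 1-9 scale)
    A = np.array([
        [1.0, 1/2, 2.0, 1.0],
        [2.0, 1.0, 2.0, 2.0],
        [1/2, 1/2, 1.0, 1.0],
        [1.0, 1/2, 1.0, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))                 # index of principal eigenvalue
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()                         # normalized priority weights

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)             # consistency index
    cr = ci / 0.90                                   # random index RI = 0.90 for n = 4

    for name, w in zip(criteria, weights):
        print(f"{name:13s} {w:.3f}")
    print(f"consistency ratio: {cr:.3f} (conventionally acceptable below 0.10)")

  A geometric-mean approximation of each row is a common alternative to the eigenvector when the matrix is small or nearly consistent.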

A Study on the Development of Assessment Index for Catastrophic Incident Warning Sign at Refinery and Petrochemical Plants (정유 및 석유화학플랜트 중대사고 전조신호 평가지표 개발에 관한 연구)

  • Yun, Yong Jin;Park, Dal Jae
    • Korean Chemical Engineering Research, v.57 no.5, pp.637-651, 2019
  • Major accidents such as explosions at refinery and petrochemical plants have caused serious loss of life and property and have had a great impact on the insurance market. For catastrophic incidents in process industries such as refineries and petrochemical plants, only the proximate causes of loss have typically been identified and studied, by property insurers' inspectors or claims adjusters, incident cause investigators, and national forensic service workers; root cause analysis (RCA), the identification of contributing factors, and the establishment of preventive measures before an incident occurs have not been carried out well. In this study, the warning-sign criteria of the CCPS catastrophic incident warning sign self-assessment tool, which were derived through the RCA method and a contributing-factor analysis based on the Swiss cheese model, were reviewed first. Second, to determine the major incident warning signs at actual chemical plants, 614 recommendations issued during the last 17 years by loss control engineers of global reinsurers were analyzed. Finally, to make the assessment practical, the warning-sign criteria for chemical plants were grouped by type into upper and lower categories, and a catastrophic incident warning sign index was developed using category weights derived by the analytic hierarchy process (pairwise comparison method) from a questionnaire answered by relevant experts at the chemical plants. The final assessment index for catastrophic incident warning signs is expected to be used by internal and external auditors of refinery and petrochemical plants to assess vulnerability related to incident warning signs and to identify the warning-sign elements that need to be tracked and managed to prevent serious incidents in the future.
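  An illustrative sketch of how an AHP-weighted warning-sign index of the kind described above can be aggregated from category scores. The category names, weights, and audit scores below are hypothetical placeholders, not the categories or weights developed in the paper.

    # Weighted composite index from upper-category weights and audit scores.
    weights = {                       # upper category -> AHP-derived weight (sums to 1.0)
        "process safety information": 0.20,
        "mechanical integrity":       0.30,
        "operating procedures":       0.25,
        "management of change":       0.25,
    }
    audit_scores = {                  # audit score per category, 0 (poor) to 5 (good)
        "process safety information": 3,
        "mechanical integrity":       2,
        "operating procedures":       4,
        "management of change":       3,
    }

    index = sum(weights[c] * audit_scores[c] / 5.0 for c in weights)
    print(f"warning-sign index: {index:.2f} (closer to 1.0 means less vulnerable)")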

Development of Digital Transceiver Unit for 5G Optical Repeater (5G 광중계기 구동을 위한 디지털 송수신 유닛 설계)

  • Min, Kyoung-Ok;Lee, Seung-Ho
    • Journal of IKEEE, v.25 no.1, pp.156-167, 2021
  • This paper proposes the design of a digital transceiver unit for in-building 5G optical repeaters, which extend the coverage of 5G mobile communication services and provide a stable wireless connection inside buildings. The proposed unit consists of four blocks: a signal processing unit, an RF transceiver unit, an optical input/output unit, and a clock generation unit. The signal processing unit plays a central role, handling the basic operation of the CPRI interface, the combining of the 4-channel antenna signals, and responses to external control commands; it exchanges high-quality IQ data over the JESD204B interface, and its CFR and DPD blocks operate to protect the power amplifier. The RF transceiver unit performs analog-to-digital conversion of the RF signal received from the antenna and passes it to the signal processing unit over JESD204B, and performs digital-to-analog conversion of the digital signal delivered from the signal processing unit over JESD204B to transmit the RF signal to the antenna. The optical input/output unit converts electrical signals to optical signals for transmission and optical signals back to electrical signals on reception. The clock generation unit suppresses jitter on the synchronous clock supplied from the CPRI interface of the optical input/output unit and supplies a stable synchronous clock to the signal processing unit and the RF transceiver; before a CPRI connection is established, a local clock is supplied so the unit operates in a CPRI connection-ready state. The XCZU9CG-2FFVC900I device from Xilinx's MPSoC series was used to implement and evaluate the proposed digital transceiver unit, with Vivado 2018.3 as the design tool. In the evaluation, the unit converted the incoming 5G RF signal to digital through the ADC and transmitted it to the test JIG over CPRI, and output the downlink data signal received from the JIG over CPRI through the DAC. The measured flatness, return loss, channel power, ACLR, EVM, frequency error, and other metrics all met or exceeded the target values.
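  For reference, a small illustration of how EVM, one of the metrics listed above, is computed from received IQ samples against an ideal reference constellation. The data here are synthetic QPSK symbols with added noise, not the authors' measurement setup.

    # RMS EVM from reference and received IQ samples (synthetic example).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1000
    ref = (rng.choice([-1.0, 1.0], n) + 1j * rng.choice([-1.0, 1.0], n)) / np.sqrt(2)
    noise = 0.02 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    rx = ref + noise                              # received symbols with error

    evm_rms = np.sqrt(np.mean(np.abs(rx - ref) ** 2) / np.mean(np.abs(ref) ** 2))
    print(f"EVM (RMS): {100 * evm_rms:.2f} %")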

A Systematic Review of Developmental Coordination Disorders in South Korea: Evaluation and Intervention (국내의 발달성협응장애(DCD) 연구에 관한 체계적 고찰 : 평가와 중재접근 중심으로)

  • Kim, Min Joo;Choi, Jeong-Sil
    • The Journal of Korean Academy of Sensory Integration, v.19 no.1, pp.69-82, 2021
  • Objective: This work is intended to provide basic information about Developmental Coordination Disorder (DCD) in South Korea for researchers and practitioners in occupational therapy. Previous research on screening for DCD and on the effects of intervention programs was reviewed. Methods: Peer-reviewed papers on DCD published in Korea from January 1990 to December 2020 were systematically reviewed. The search terms "developmental coordination disorder," "development coordination," and "developmental coordination" were used to identify previous Korean research in three representative databases: the Research Information Sharing Service, the Korean Studies Information Service System, and Google Scholar. A total of 4,878 articles were identified through the three search engines, and seventeen articles were selected for analysis after removing those that were duplicates or met the exclusion criteria. "The conceptual model" was adopted to analyze the selected articles with respect to DCD assessment and intervention. Results: Twelve of the 17 studies were at qualitative Level 2, using a non-randomized two-group design. The Movement Assessment Battery for Children and its second edition were the tools most frequently used to assess children for DCD. Among the intervention studies, eight articles (47%) adopted a dynamic systems approach; a normative functional skill framework and cognitive neuroscience were each used in 18% of the articles; and 11% applied neurodevelopmental theory. Only one article used a combined approach of normative functional skill and general abilities. These papers mainly focused on the movement characteristics of children with DCD and on the effects of exercise or sports intervention programs. Conclusion: Most of the reviewed studies investigated the movement characteristics of DCD or explored the effectiveness of particular intervention programs. In the future, it would be useful to investigate the feasibility of different assessment tools and to establish the effectiveness of the various interventions used in rehabilitation to improve motor performance in children with DCD.

Analysis of Intervention in Activities of Daily Living for Stroke Patients in Korea: Focusing on Single-Subject Research Design (국내 뇌졸중 환자를 대상으로 한 일상생활활동 중재 연구 분석: 단일대상연구 설계를 중심으로)

  • Sung, Ji-Young;Choi, Yoo-Im
    • Therapeutic Science for Rehabilitation, v.13 no.1, pp.9-21, 2024
  • Objective: The purpose of this study was to examine the characteristics and quality of single-subject research on interventions to improve activities of daily living (ADL) in stroke patients. Methods: The keywords 'stroke,' 'activities of daily living,' and 'single-subject studies' were searched among papers published in the last 15 years (2009 to 2023) in the Research Information Sharing Service, DBpia, and e-articles databases. A total of nine papers were analyzed for their characteristics and quality. Results: The independent variables applied to improve ADL included constraint-induced therapy, mental practice for performing functional activities, virtual reality-based task training, subjective postural vertical training without visual feedback, bilateral upper-limb movement, a core stability training program, traditional occupational therapy combined with neurocognitive rehabilitation, smooth pursuit eye movement, neck muscle vibration, and occupation-based community rehabilitation. The Assessment of Motor and Process Skills was the most common evaluation tool for measuring the dependent variables, used in four articles, and the Modified Barthel Index and the Canadian Occupational Performance Measure were used in two articles each. Of the nine studies, seven were of high quality, two of moderate quality, and none of low quality. Conclusion: Various types of rehabilitation treatment have been actively applied as interventions to improve the daily living activities of stroke patients, and the quality of the single-subject studies applying ADL interventions was reliable.

Characteristics and Implications of Sports Content Business of Big Tech Platform Companies : Focusing on Amazon.com (빅테크 플랫폼 기업의 스포츠콘텐츠 사업의 특징과 시사점 : 아마존을 중심으로)

  • Shin, Jae-hyoo
    • Journal of Venture Innovation, v.7 no.1, pp.1-15, 2024
  • This study elucidates the characteristics of big tech platform companies' sports content business in an environment of rapid digital transformation. Specifically, it examines the market structure of big tech platform companies with a focus on Amazon, reveals the role of sports content within that structure through an analysis of Amazon's sports marketing business, and provides an outlook on the sports content business of big tech platform companies. Operating two-sided market platform business models, big tech companies incorporate sports content as a strategy to enhance the value of their platforms: sports content is used to increase platform value and to consolidate a monopoly position by maximizing profits through greater synergy across the platform ecosystem, including its infrastructure. Amazon acquires popular live sports broadcasting rights on a continental or national basis and supplies them to its platforms, which not only attracts new customers and stimulates purchases, but also allows Amazon to provide IT solution services to sports organizations and teams while planning and supplying various promotional contents, creating synergy across Amazon's platforms, including its advertising business. Amazon also expands its business opportunities and increases its overall value by supplying live sports content to Amazon Prime Video and Amazon Prime, providing technical services to various stakeholders through Amazon Web Services, and offering Amazon Marketing Cloud services for analyzing and predicting advertisers' advertising and marketing performance. This gives rise to a new paradigm in the sports marketing business in the digital era, stemming from the difference in market structure between big tech companies based on two-sided market platforms and legacy global companies based on one-sided markets. The core of this new model is a business built on the development of various contents based on live sports streaming rights, and sports content marketing will become a major field of sports marketing alongside traditional broadcasting rights and sponsorship. Big tech platform companies such as Amazon, Apple, and Google have the potential to become new global sports marketing companies, and current sports marketing and advertising companies, as well as teams and leagues, face both crises and opportunities.

Design and Implementation of MongoDB-based Unstructured Log Processing System over Cloud Computing Environment (클라우드 환경에서 MongoDB 기반의 비정형 로그 처리 시스템 설계 및 구현)

  • Kim, Myoungjin;Han, Seungho;Cui, Yun;Lee, Hanku
    • Journal of Internet Computing and Services, v.14 no.6, pp.71-84, 2013
  • Log data, which record the multitude of information created while computer systems operate, are used in many processes, from system inspection and process optimization to customized user services. In this paper, we propose a MongoDB-based unstructured log processing system in a cloud environment for handling the massive log data of banks. Most of the log data generated during banking operations come from handling clients' business, so a separate system is needed to gather, store, categorize, and analyze the log data generated while that business is processed. However, in existing computing environments it is difficult to realize flexible storage expansion for a massive amount of unstructured log data and to execute the considerable number of functions needed to categorize and analyze it. In this study, we therefore use cloud computing technology to build a cloud-based log data processing system for unstructured log data that the analysis tools and management systems of the existing computing infrastructure cannot handle well. The proposed system uses an IaaS (Infrastructure as a Service) cloud environment to provide flexible expansion of computing resources, including storage space and memory, under conditions such as storage extension or a rapid increase in log data. Moreover, to overcome the processing limits of existing analysis tools when real-time analysis of the aggregated unstructured log data is required, the system includes a Hadoop-based analysis module for quick and reliable parallel-distributed processing of massive log data. Because the HDFS (Hadoop Distributed File System) stores data by generating copies of the block units of the aggregated log data, the system can also recover automatically from a malfunction and continue operating. Finally, by establishing a distributed database with the NoSQL-based MongoDB, the system processes unstructured log data effectively. Relational databases such as MySQL have fixed schemas that are inappropriate for unstructured log data, and such strict schemas cannot expand across nodes when the stored data must be distributed to many nodes as the data volume grows rapidly. NoSQL databases do not provide the complex computations that relational databases offer, but they can easily expand through node dispersion when the amount of data increases rapidly; they are non-relational databases with a structure appropriate for unstructured data. NoSQL data models are usually classified as key-value, column-oriented, or document-oriented. Of these, the representative document-oriented database MongoDB, which has a free schema structure, is used in the proposed system: its flexible schema makes unstructured log data easy to process, it supports flexible node expansion when the data volume grows rapidly, and it provides an Auto-Sharding function that automatically expands storage.
The proposed system is composed of a log collector module, a log graph generator module, a MongoDB module, a Hadoop-based analysis module, and a MySQL module. When the log data generated over the entire client business process of each bank are sent to the cloud server, the log collector module collects and classifies them according to log type and distributes them to the MongoDB module and the MySQL module. The log graph generator module produces the log analysis results of the MongoDB module, the Hadoop-based analysis module, and the MySQL module by analysis time and by type of the aggregated log data, and presents them to the user through a web interface. Log data requiring real-time analysis are stored in the MySQL module and served in real time by the log graph generator module. The log data aggregated per unit time are stored in the MongoDB module and plotted as graphs according to the user's analysis conditions, and these aggregated data are processed in a parallel-distributed manner by the Hadoop-based analysis module. A comparative evaluation of log data insertion and query performance against a system that uses only MySQL demonstrates the proposed system's superiority, and an optimal chunk size is identified through MongoDB insert performance tests at various chunk sizes.
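  A minimal pymongo sketch of the schema-free storage pattern the MongoDB module relies on. The connection string, database and collection names, and log fields are hypothetical, not taken from the paper: log records of different shapes are stored in one collection, and a per-unit-time aggregate of the kind the log graph generator plots is computed with the aggregation pipeline.

    # Schema-free log storage and per-minute aggregation with MongoDB (pymongo).
    from datetime import datetime
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")   # assumed local MongoDB instance
    logs = client["bank_logs"]["raw_logs"]              # hypothetical database/collection

    # Unstructured records: each document may carry different fields.
    logs.insert_many([
        {"ts": datetime(2013, 11, 1, 9, 0), "type": "transfer", "branch": "A", "amount": 120000},
        {"ts": datetime(2013, 11, 1, 9, 1), "type": "login",    "channel": "mobile"},
        {"ts": datetime(2013, 11, 1, 9, 2), "type": "error",    "code": "E42", "detail": "timeout"},
    ])

    # Per-minute counts by log type, the kind of aggregate a graph module could plot.
    pipeline = [
        {"$group": {
            "_id": {
                "minute": {"$dateToString": {"format": "%Y-%m-%d %H:%M", "date": "$ts"}},
                "type": "$type",
            },
            "count": {"$sum": 1},
        }},
        {"$sort": {"_id.minute": 1}},
    ]
    for row in logs.aggregate(pipeline):
        print(row["_id"]["minute"], row["_id"]["type"], row["count"])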