The Impact of Product Review Usefulness on the Digital Market Consumers Distribution

  • Seung-Yong LEE (College of General Education, Namseoul University) ;
  • Seung-wha (Andy) CHUNG (School of Business, Yonsei University) ;
  • Sun-Ju PARK (School of Business, Yonsei University)
  • Received : 2024.01.08
  • Accepted : 2024.03.05
  • Published : 2024.03.30

Abstract

Purpose: This quantitative study uses text mining techniques on product review big data to analyze how the extremity and perceived usefulness of product reviews affect sales performance. We investigate whether the perceived helpfulness of product reviews serves as a mediating factor in the impact of product review extremity on sales performance. Research design, data and methodology: The analysis emphasizes customer interaction factors associated with both product review helpfulness and sales performance. Out of the 8.26 million Amazon product reviews in the book category collected by He & McAuley (2016), 300,000 product reviews were processed with text mining based on natural language processing, and the hypotheses were verified through hierarchical regression analysis. Results: The extremity of product reviews had a negative impact on the evaluation of helpfulness, and helpfulness played a mediating role between the extremity of product reviews and sales performance. Conclusion: The more extreme content a product review's text contains, the lower its helpfulness evaluation. The evaluation of helpfulness, in turn, exerts a negative mediating effect on sales performance. This study offers empirical insights for digital market distributors and sellers and contributes to the research field on product reviews based on review ratings.

Keywords

1. Introduction

1.1. Customer Engagement and Distribution Efficacy

Distribution constitutes an operational endeavor that engenders the utility of place, time, and ownership, facilitating the seamless transfer of goods and services from producers to consumers. Because distribution is ultimately directed toward consumers, it is occasionally explained within the realm of marketing, encompassing facets like brand reputation and consumer engagement (Thabit & Raewf, 2018). The efficacy of distribution is influenced not solely by the direct conveyance of goods (Rao et al., 2009) but also by how effectively consumers make purchasing decisions.

The digital industry is disrupting many existing industries. Cars, once purely mechanical machines, are transforming into digital devices, and mobile devices are becoming portable computers (Rahmati et al., 2021). Companies are also being asked to adopt an integrated digital perspective that goes beyond existing institutionalized processes while introducing innovative digital technology into their products and operations (Svahn et al., 2017). The market is in a similar situation. The influence of the online market is growing, especially since COVID-19, and customers in the online market actively use blogs, online forums, and purchase reviews from digital markets to present their opinions (Shen et al., 2015).

In the online-based digital industry, product reviews are a highly influential factor in sales performance, and because customers write them voluntarily, they have characteristics distinct from the seller's own information.

However, it is difficult to say that product reviews have only a positive effect on sales performance, and there is a lack of research on how the usefulness evaluation of product reviews affects sales performance. Digital markets frequently place reviews rated as useful at the front of the review section; if such reviews negatively affect sales performance, this practice can itself harm sales, so research on this matter should be carried out promptly. The main purpose of this study is to contribute to improving the performance of the digital market and to expand related research by examining how the usefulness evaluation of product reviews affects sales performance.

Amazon, a prominent global digital marketplace, has achieved groundbreaking efficiency in distribution through the implementation of its fulfillment system. However, realizing this accomplishment would have posed challenges without a dedicated emphasis on customer service within Amazon's online platform (Prahalad & Ramaswamy, 2000). These days, digital markets actively endeavor to influence consumers' purchasing decisions by employing sharing-economy mechanisms such as purchase review systems (Nadeem et al., 2021), while sellers cultivate product loyalty through strategic advertising on social media platforms (Rapp et al., 2013) to improve the efficacy of distribution. Favorable consumer evaluations within such a consumer support system can cultivate a positive reputation that carries over into product purchase performance (Salehan & Kim, 2016).

1.2. Characteristics of the Digital Market

The digital market's share is expanding compared to the offline market, driven by the spatial and temporal convenience it offers buyers (Taken Smith, 2012; Bala & Verma, 2018; Kannan, 2017). Influenced by the recent COVID-19 pandemic, non-face-to-face technologies developed rapidly (Chakraborty & Maity, 2020; Donthu & Gustafsson, 2020) and are accelerating the shift of consumers toward the digital market (Gu et al., 2021; Hashem, 2020; Akram et al., 2021). Companies venturing into the digital market are confronted with the pressing need to delve into the intricacies of consumers' online purchasing decision-making processes and to formulate innovative strategies that offer a competitive advantage (Kurdi et al., 2022; Gursoy et al., 2022).

Consumers in the digital market actively utilize various online channels for gathering information and assessing the value of products (Ghose & Ipeirotis, 2010; Ryan, 2016; Chaffey & Smith, 2013). In contrast to traditional markets, where customers can physically evaluate products by seeing, smelling, and touching them (Degeratu et al., 2000; Levin et al., 2003; Füller et al., 2007), buyers in online markets must assess the value of products mainly from information available online. This limitation emphasizes the importance of information provided by suppliers or external experts, as well as insights from previous buyers (Kannan, 2017; Chaffey & Smith, 2013).

1.2.1. Information Distortion in Digital Market

However, ensuring the objectivity of information in online channels poses challenges (Taiminen & Karjaluoto, 2015; Chaffey & Smith, 2013). Sellers in the digital market may selectively emphasize positive aspects while concealing negative information in product descriptions (Dimoka et al., 2012; Pavlou & Gefen, 2005; Granados et al., 2006). Comparison shopping websites such as Google Shopping often highlight specific products for which they receive advertising fees (Olbrich & Schultz, 2014; Yang & Ghose, 2010).

1.2.2. Objectivity of Product Review

In situations of information distortion, consumers actively rely on product reviews (Mudambi & Schuff, 2010; Sen & Lerman, 2007). Previous buyers typically share their experiences by voluntarily writing product reviews without any specific remuneration (Sen & Lerman, 2007; Park & Kim, 2008). Although an individual product review may be subjective, product reviews as a whole are perceived as relatively objective compared to the seller's product description (Ghose & Ipeirotis, 2010). Some potential buyers further enhance objectivity by assessing the helpfulness evaluations of product reviews (Decker & Trusov, 2010).

Researchers have investigated the impact of product reviews on sales performance (Kim et al., 2016). Comprehensive studies have been conducted on the influence of product review ratings, which represent a numerical evaluation of a product (Hu et al., 2014; Floyd et al., 2014; Moon et al., 2010). Perceived as objective, product review ratings are easily collected and analyzed in this research field (Sutton & Austin, 2015; Miles, 1979; Büschken & Allenby, 2016), and many studies have employed the mean or variance of ratings as the main variable (De Langhe et al., 2016; Kopalle et al., 2017).

In this context, product review data from Amazon, the largest global digital market platform, has been used for comprehensive research on the digital market, encompassing quantitative analysis of ratings and big data-driven studies, including sentiment analysis (Haque et al., 2018; Fang & Zhan, 2015; Feng et al., 2012; Danescu-Niculescu-Mizil et al., 2009).

This study focuses on the possibility that product reviews with high usefulness ratings, which are often placed at the front of the product review section in the digital market, may negatively affect sales performance. In addition, it differs from prior work by extending rating-based studies of product review extremity through the active use of big data methodologies. To this end, this study adopts a methodology that analyzes Amazon big data through text mining. Because the usefulness evaluation of a product review is based not only on its rating but also on a reading of the review text, we judged that text mining would be useful for understanding the characteristics of these review texts.

2. Theoretical Background

2.1. Literature on Product Review

2.1.1. Voluntary Characteristics of Product Reviews

With the proliferation of various social media platforms, many individuals who have no partnership with product sellers voluntarily provide meaningful product information on platforms such as blogs, YouTube, and Instagram (Flanagin & Metzger, 2008; Chadwick, 2007). The provision of such voluntary product information has a significant impact on the sales performance of the digital market (Tiago & Veríssimo, 2014; Bala & Verma, 2018). Product reviews are recognized as having a positive impact on product sales (Tiago & Veríssimo, 2014; Ghose & Ipeirotis, 2010; Floyd et al., 2014). Accordingly, many digital market platforms seek to influence sales performance by providing opportunities for individuals to voluntarily produce information about their products (Killian & McManus, 2015; Lamberton & Stephen, 2016; Labrecque et al., 2013).

Product reviews, which contain evaluative comments provided by buyers after purchase, are recognized as more objective than sellers' information (Park et al., 2007; Reyes & Rosso, 2012).

2.1.2. Composition of Product Reviews

Product reviews in digital markets typically include a title, text, and rating (Park et al., 2007), and some platforms provide a voting system to measure the helpfulness of reviews (Ghose & Ipeirotis, 2007; Ghose & Ipeirotis, 2010). Potential buyers can evaluate product reviews through their volume, average rating, and the insights contained in their content (Flanagin et al., 2014; Kreimeyer et al., 2017).

2.1.3. Research Areas of Product Reviews

Many studies have explored the impact of product reviews on corporate performance. Their main focus has been on quantitative dimensions such as the mean and standard deviation of ratings and the number of product reviews (Ghose & Ipeirotis, 2010; Hu et al., 2008; Sen & Lerman, 2007; Park et al., 2007; Dellarocas et al., 2007). Previous studies on these quantitative aspects have shown that higher mean ratings lead to higher sales performance (Chevalier & Mayzlin, 2006; Hu et al., 2008; Ye et al., 2009). Similar effects have been observed for the number of product reviews (Zhou et al., 2022).

Although product review texts contain meaningful comments from previous buyers, limited research has been conducted on them because of the difficulty of analyzing textual content.

2.2. Limitations of the Product Reviews

2.2.1. The Limitation of a 5-point Scale Rating

Many digital markets, such as Amazon, use 5-point scales for product valuation ratings (Coelho & Esteves, 2007; Garland, 1991). Some argue that accurately distinguishing differences within these five scale points is difficult (Roche et al., 2004). To address this, adopting a 7- or 10-point scale has been proposed (Finstad, 2010; Tarka, 2017). However, increasing the scale size may reduce response quality because response time increases (Wisner, 2020).

2.2.2. Bias in Ratings

Moreover, product review ratings exhibit a propensity toward skewness (Mudambi & Schuff, 2010; Kuan et al., 2015; Croarkin et al., 2004). Review authors, typically buyers who have already made a purchase decision for the product, tend to provide positive reviews, influenced both by the information they gathered during the decision-making process and by a sense of self-efficacy regarding the product (Lee et al., 2008; Sen & Lerman, 2007). Buyers may also intentionally submit extreme product reviews to express their opinions clearly (Filieri et al., 2018; Mudambi & Schuff, 2010), which further increases the bias of review ratings.

2.3. Analysis of Product Review Content

To enhance the objective evaluation of a product's value, it is essential to comprehensively understand the buyer's opinion by integrating additional information beyond the product review rating. While some product review authors provide only ratings, many actively express specific opinions in the review text. Analyzing the textual content of a product review proves highly beneficial in evaluating the product's value (Hu et al., 2014; Fang et al., 2016; Patrick et al., 2011).

This study closely examines the elements of product reviews through big data analysis and analyzes their impact on sales performance, thereby identifying interactions between customers in detail.

2.3.1. Using Big Data Analysis Methodology

Advancements in big data analysis techniques, including data mining, play a crucial role in overcoming the limitations of product review research centered on quantitative factors (Ghazal et al., 2013; Jeyapriya & Selvi, 2015). Big data analysis facilitates the identification of which products buyers tend to purchase together, offering insights into their preferences (He & McAuley, 2016; Akter & Wamba, 2016; Chaffey & Smith, 2022). Text mining, one of the prominent big data methodologies, employs natural language processing (NLP) techniques to identify parts of speech and other language elements in text (Nadkarni et al., 2011; Hirschberg & Manning, 2015; Yi et al., 2003; Sun et al., 2017). These big data methodologies expand the scope of existing product review studies, which were previously constrained to review volume and ratings.

It remains complicated to sort out the content of product reviews. Examining extreme expressions in texts aids in comprehending opinion extremity. Recent advances in text mining techniques that can quantify words and parts of speech in a text provide a means of identifying the effects of extremity in product evaluation (Owoputi et al., 2013). Since extreme expressions often incorporate superlative adjectives and superlative adverbs, scrutinizing the presence of these words in product reviews facilitates a nuanced understanding of extremity (Ravi & Ravi, 2015; Zhao et al., 2021).
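As an illustration, a superlative count of this kind can be obtained with off-the-shelf part-of-speech tagging. The sketch below uses NLTK's Penn Treebank tagger, in which JJS marks superlative adjectives and RBS superlative adverbs; the exact toolchain used in this study is not specified, so the example is illustrative only.

```python
# A minimal sketch of superlative detection with the Penn Treebank tagset
# (JJS = superlative adjective, RBS = superlative adverb). Illustrative only;
# the exact toolchain used in this study is not specified.
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

def count_superlatives(review_text: str) -> int:
    """Count superlative adjectives (JJS) and adverbs (RBS) in a review text."""
    tokens = nltk.word_tokenize(review_text)
    tagged = nltk.pos_tag(tokens)  # [(token, POS tag), ...]
    return sum(1 for _, tag in tagged if tag in ("JJS", "RBS"))

print(count_superlatives("This is the best book I have ever read, most clearly written."))
```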

2.3.2. Review Extremity and Helpfulness Evaluation

Previous research has recognized that the extremity of product reviews influences sales (Moon et al., 2014). Researchers hold varied perspectives regarding the extremity of product reviews. Some argue that extreme reviews lack utility because of their perceived lack of persuasiveness (Mudambi & Schuff, 2010; Filieri, 2016), while others assert that extreme product reviews are considered helpful (Filieri et al., 2018; Cao et al., 2011). It has also been argued that these results depend on the empirical and exploratory nature of the data and on the characteristics of the products (Mudambi & Schuff, 2010).

Studies examining the helpfulness of product reviews commonly analyze the standard deviation of ratings (Eslami et al., 2018). Big data analysis can be leveraged to identify whether the text of a product review contains extreme expressions, facilitating a comprehensive understanding of the impact of extreme product reviews on usefulness evaluations.

Potential buyers frequently cast helpfulness votes after reading the text of a product review, rather than relying solely on its numerical rating. Extreme opinions on products are likely to be considered unhelpful, as they may be perceived as deviating significantly from the product's actual value. The perception of extreme content in a product review can therefore lead to a lower helpfulness vote for that review.

In light of the above considerations, the hypothesis is formulated as follows.

H1: The more extreme terms are used in the product review text, the lower the rate of helpfulness will be.

2.3.3. Review Helpfulness and Sales Performance

In assessing the helpfulness of product reviews and their association with sales performance, researchers contend that reviews tend to be more helpful when they are less extreme and contain substantial information (Mudambi & Schuff, 2010); such reviews, however, can negatively affect sales performance. Although longer product reviews may be perceived as helpful, they may also present both negative and positive information about the product, increasing uncertainty in purchasing decisions and therefore having a potentially negative impact on sales performance (Chen et al., 2009). The extremity of a product review is known to influence decision-making diagnostically, with a positive effect on sales performance (Skowronski & Carlston, 1989). Given that extreme reviews are considered unhelpful (Mudambi & Schuff, 2010), the perceived helpfulness of reviews may have a relatively negative impact on sales volume.

In light of the above considerations, the hypothesis is formulated as follows.

H2: The rate of helpfulness of product reviews is expected to negatively impact sales performance.

2.3.4. Mediating Effects of Product Review Helpfulness

Despite numerous studies exploring the determinants of product review helpfulness, there is limited research on the impact of helpfulness on sales performance (Li et al., 2013; Krishnamoorthy, 2015). Product reviews considered helpful can delay purchasing decisions (Lee & Choeh, 2020). A large number of helpfulness votes can also reflect the increased interest of potential buyers. When potential buyers are unable to read all of the product reviews, reviews labeled as helpful can attract more attention and influence purchasing decisions. Consequently, the results of helpfulness votes, which represent the evaluation of potential buyers, are expected to mediate the impact of factors such as product review extremity on sales performance.

Research on the usefulness of product reviews has mainly focused on the factors influencing the usefulness evaluation, such as length, readability, extreme ratings, time of creation, and product characteristics (Wang et al., 2019; Kuan et al., 2015), but there are few studies on the relationship between usefulness and performance. In this study, both the factors affecting the usefulness evaluation and the effect of usefulness on sales performance were identified.

In light of the above considerations, the hypothesis is formulated as follows.

H3: The helpfulness vote will serve as a mediating factor in the impact of the extremity of the review text on sales performance.

3. Research Methods and Data

3.1. Defining and Measuring Variables

3.1.1. Dependent Variable

A company's performance is assessed through various indicators such as sales, net profit, product sales, and brand recognition (Kapferer, 2008; Murphy et al., 1996). However, many digital markets, including Amazon, do not disclose direct information about a seller's performance, treating it as a trade secret (Easterbrook, 1981; Glaeser, 2018). Amazon does, however, reveal sales rankings within each category, enabling identification of relative product sales volume.

Amazon's sales ranking algorithm is known to depend on both current and cumulative sales (Sharma et al., 2020). Consequently, many studies on Amazon's performance use sales ranking as a significant indicator of sales performance (Mudambi & Schuff, 2010; Amblee & Bui, 2011). Therefore, in this study, the book-category sales ranking contained in the Amazon sales and product review data (He & McAuley, 2016) is designated as the dependent variable.

3.1.2. Independent Variable

The independent variable in this study is the extremity of the product review text. It was measured by quantifying the mean presence of superlative adverbs and superlative adjectives in the review text, in accordance with established methodologies from prior research (Sun et al., 2017).
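A minimal sketch of this operationalization is shown below, reusing the superlative counter sketched in Section 2.3.1. The column names (asin, reviewText) follow the public Amazon review dataset, and the exact aggregation rule used by the authors is an assumption.

```python
# Hypothetical aggregation of the text-extremity measure: flag whether each
# review contains any superlative, then take the mean flag per product.
# Column names (asin, reviewText) follow the public Amazon review dataset;
# the authors' exact operationalization may differ.
import pandas as pd

def add_text_extremity(reviews: pd.DataFrame) -> pd.DataFrame:
    reviews = reviews.copy()
    reviews["has_superlative"] = (
        reviews["reviewText"].fillna("").map(count_superlatives) > 0
    )
    return (reviews.groupby("asin")["has_superlative"]
                   .mean()
                   .rename("review_text_extremity")
                   .reset_index())
```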

3.1.3. Mediator Variable

The mediating variable in this study is the evaluation of product review helpfulness. The helpfulness evaluation of Amazon's product reviews, which feature a voting function, was quantified as the mean percentage of respondents who voted a review helpful out of the total number of voters for each product (Danescu-Niculescu-Mizil et al., 2009; Mudambi & Schuff, 2010).
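A corresponding sketch of the helpfulness measure is shown below. It assumes that the He and McAuley (2016) files record votes as a [helpful, total] pair per review, which matches the published dataset format, and averages the per-review ratio within each product.

```python
# Sketch of the helpfulness measure: share of "helpful" votes per review,
# averaged over each product's voted reviews. Assumes votes are stored as a
# [helpful_votes, total_votes] pair per review, as in the He & McAuley files.
import pandas as pd

def add_review_helpfulness(reviews: pd.DataFrame) -> pd.DataFrame:
    reviews = reviews.copy()
    yes = reviews["helpful"].map(lambda h: h[0])
    total = reviews["helpful"].map(lambda h: h[1])
    reviews["help_ratio"] = (yes / total).where(total > 0)  # NaN when nobody voted
    return (reviews.groupby("asin")["help_ratio"]
                   .mean()
                   .rename("review_helpfulness")
                   .reset_index())
```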

3.1.4. Control Variables

The control variables in this study, drawn from prior research on product reviews, are the mean and standard deviation (SD) of ratings (Dhanasobhon et al., 2007; Haque et al., 2018); price, review helpfulness vote mean, review helpfulness "yes" mean, and review helpfulness "yes" SD (Mudambi & Schuff, 2010; Kaushik et al., 2018); and review title extremity, review text mean, description words, review title mean, and review title SD (Zhang et al., 2011; Kaushik et al., 2018).

3.2. Data

The empirical analysis in this study used product review data sourced from Amazon, a prominent global digital marketplace (Statista, 2021). Amazon's product review system lets buyers enter a rating, a summary, review text, and their real name, and it allows other users to vote on the helpfulness of each review.

For this empirical study, the Amazon product review data collected by He and McAuley (2016) was employed. This dataset includes the number of voters who assessed the helpfulness of each review, a metric no longer publicly disclosed on Amazon's website, which makes it possible to compute the helpfulness evaluation rate. The dataset spans 33 product categories; this study focuses specifically on the book category, sold on Amazon since 1994.

The analysis covered 300,000 product reviews associated with 8,588 products. The subjects of this study were Amazon product review texts and ratings written by 65,536 buyers. All product reviews are publicly disclosed, so there are no restrictions on using them for research. The product reviews included in this study were written between November 20, 1996, and July 23, 2014.
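For reference, the review files released by He and McAuley (2016) are distributed as gzipped JSON lines, one review object per line, with fields such as asin, reviewText, summary, overall, helpful, and unixReviewTime. The loader below is a sketch assuming the strict-JSON variant of the book file; the file name and the subsampling rule are illustrative.

```python
# Sketch of loading the book-category review file from He & McAuley (2016),
# assuming the strict-JSON release: gzipped, one JSON review object per line.
# File name and the 300,000-review cutoff are illustrative.
import gzip
import json
import pandas as pd

def load_reviews(path: str, limit: int = 300_000) -> pd.DataFrame:
    rows = []
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for i, line in enumerate(f):
            if i >= limit:
                break
            rows.append(json.loads(line))
    return pd.DataFrame(rows)

reviews = load_reviews("reviews_Books_5.json.gz")
```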

Because each variable was measured with a single indicator rather than multiple questionnaire items, conventional checks of validity and reliability could not be applied; however, significant correlations were confirmed through correlation and regression analysis. Hypotheses, including the mediating effect, were verified through hierarchical multiple regression analysis.
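A sketch of the hierarchical regression step is given below, using statsmodels: Model 1 enters only the control variables and Model 2 adds the focal predictor, mirroring the tables in Section 4. Variable names are hypothetical, and variables are z-scored so that coefficients can be read as standardized betas.

```python
# Sketch of the two-step hierarchical regression: Model 1 with controls only,
# Model 2 adding the focal predictor. Variable names are hypothetical;
# variables are z-scored so coefficients read as standardized betas.
import pandas as pd
import statsmodels.api as sm

def hierarchical_ols(df: pd.DataFrame, dv: str, controls: list, focal: list):
    cols = [dv] + controls + focal
    z = (df[cols] - df[cols].mean()) / df[cols].std()
    model1 = sm.OLS(z[dv], sm.add_constant(z[controls])).fit()
    model2 = sm.OLS(z[dv], sm.add_constant(z[controls + focal])).fit()
    return model1, model2

# e.g. m1, m2 = hierarchical_ols(products, "review_helpfulness",
#                                ["rating_mean", "rating_sd", "price"],
#                                ["review_text_extremity"])
```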

4. Results

Hypothesis verification was conducted through regression analysis after preliminary examination of descriptive statistics and correlation analysis. The results of the analysis are presented below.

4.1. Descriptive Statistics and Correlation

Before conducting the comprehensive analysis, descriptive statistics and correlation analysis were performed. In Table 1, the results of the correlation analysis revealed significant associations among the key variables. Review Text Extremity (β = -.114, p = .000) displayed a statistically significant negative correlation with Sales Ranking. However, the correlation between Review Helpfulness and Sales Ranking did not reach statistical significance.

Table 1: Descriptive Statistics and Correlations


* n=8,588 (Number of products), n=300,000 (Number of product reviews), † p < .10, * p < .05, ** p < .01, *** p < .001, SD: Standard Deviation

Prior to the comprehensive analysis, a preliminary assessment was conducted to ascertain the extent of multicollinearity among the variables. All tolerances exceeded 0.1 and all variance inflation factors (VIF) remained below 8, indicating no serious multicollinearity issues. As shown in Table 2, all variables fell within an acceptable range and were therefore retained for further analysis.

Table 2: Multicollinearity of Variables


* n=7,902 (Number of products), R = .490, R² = .240, Adjusted R² = .239, F = 191.493***, Dependent Variable: Sales Ranking
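The tolerance and VIF screening described above can be reproduced with statsmodels' variance_inflation_factor, as in the sketch below (tolerance is simply 1/VIF); the predictor matrix X and its column names are hypothetical.

```python
# Sketch of the tolerance/VIF screening with statsmodels (tolerance = 1/VIF).
# X is the matrix of predictors entered in the model; column names hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def vif_table(X: pd.DataFrame) -> pd.DataFrame:
    Xc = sm.add_constant(X)
    rows = []
    for i, col in enumerate(Xc.columns):
        if col == "const":
            continue
        vif = variance_inflation_factor(Xc.values, i)
        rows.append({"variable": col, "VIF": vif, "tolerance": 1.0 / vif})
    return pd.DataFrame(rows)
```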

4.2. Review Extremity and Helpfulness: H1

Table 3 below presents the outcomes of the analysis examining the impact of product review text extremity on the evaluation of helpfulness.

Table 3: Review Text Extremity and Review Helpfulness


* n=7,905 (Number of products), † p < .10, * p < .05, ** p < .01, *** p < .001, Dependent Variables: Review Helpfulness

Model 1 in Table 3, which incorporated only the control variables, showed significant associations between the control variables and the dependent variable, Review Helpfulness.

In Model 2 of Table 3, which added the independent variable, Review Text Extremity exhibited a statistically significant negative relationship with Review Helpfulness (β = -.042, p = .004). Consequently, Hypothesis 1 was supported.

4.3. Review Helpfulness and Sales Performance: H2

Table 4 presents the findings from the analysis investigating the impact of review helpfulness on sales ranking.

Table 4: Review Helpfulness and Sales Ranking


* n=7,902 (Number of products), † p < .10, * p < .05, ** p < .01, *** p < .001, Dependent variable: Sales ranking

Model 1 in Table 4, incorporating solely control variables, revealed a significant association between all control variables and the dependent variable, Sales Ranking.

In Model 2, with the inclusion of additional independent variables, Review Helpfulness demonstrated a significant negative relationship with Sales Ranking (β = -.035, p = .001). Consequently, Hypothesis 2 was supported.

4.4. Mediating Effects of Helpfulness: H3

Baron and Kenny's (1986) conditions for assessing mediating effects were employed to validate the mediating role of Review Helpfulness in the influence of Review Text Extremity on Sales Ranking. The first condition requires a significant effect of the independent variable on the mediator; the second requires a significant effect of the independent variable on the dependent variable; and the third requires a significant effect of the mediator on the dependent variable when both are entered together.
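Operationally, these three conditions correspond to three regressions. The sketch below shows them with statsmodels, omitting the control variables for brevity; variable names are hypothetical and do not reproduce the exact model specification used here.

```python
# The three Baron & Kenny (1986) steps as three OLS models (controls omitted
# for brevity; variable names hypothetical, variables z-scored).
import pandas as pd
import statsmodels.api as sm

def baron_kenny(df: pd.DataFrame, x: str, m: str, y: str):
    z = (df[[x, m, y]] - df[[x, m, y]].mean()) / df[[x, m, y]].std()
    step1 = sm.OLS(z[m], sm.add_constant(z[[x]])).fit()     # X -> M
    step2 = sm.OLS(z[y], sm.add_constant(z[[x]])).fit()     # X -> Y
    step3 = sm.OLS(z[y], sm.add_constant(z[[x, m]])).fit()  # X + M -> Y
    return step1, step2, step3

# Partial mediation: X remains significant in step3 but with a smaller
# coefficient than in step2; full mediation: X loses significance in step3.
```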

The first condition was assessed and supported through Hypothesis 1. For the second condition, Review Text Extremity (β = .031, p = .023) in Model 2 of Table 5 was found to have a statistically significant association with Sales Ranking. For the third condition, when all variables were entered in Model 3, all variables remained statistically significant, indicating a mediating effect. As evident in Models 2 and 3, Review Helpfulness exhibited a partial mediating effect, with Review Text Extremity remaining statistically significant but attenuated (from β = .031, p = .023 to β = .030, p = .030).

Table 5: Mediating Effects of Review Helpfulness


* n=7,902 (Number of products), † p < .10, * p < .05, ** p < .01, *** p < .001, Dependent variable: Sales ranking

Sobel-test (Sobel, 1982) results also affirmed the significance of the mediating effect of Review Helpfulness.

In Table 6, 'a' represents the unstandardized coefficient from the regression of the mediator on the independent variable, with 'Sa' denoting the standard error of 'a.' Meanwhile, 'b' signifies the unstandardized coefficient of the mediator in the full regression model for the dependent variable, and 'Sb' denotes the standard error of 'b.'
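Given these quantities, the Sobel statistic is z = ab / √(b²Sa² + a²Sb²). A small helper for the computation is sketched below.

```python
# Sobel (1982) statistic from the quantities above:
# z = a*b / sqrt(b^2*Sa^2 + a^2*Sb^2), with a two-tailed p-value.
from math import sqrt
from scipy.stats import norm

def sobel_test(a: float, sa: float, b: float, sb: float):
    z = (a * b) / sqrt(b**2 * sa**2 + a**2 * sb**2)
    p = 2 * (1 - norm.cdf(abs(z)))
    return z, p
```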

Table 6: The Mediating Effect of Helpfulness by Sobel-test


4.5. Discussion

The empirical analysis showed that the evaluation of the helpfulness of product reviews was influenced by the extremity of the product review text. The extremity of the review text was measured by employing text mining techniques to quantify the occurrence of superlative adjectives and superlative adverbs. In the text mining process, the text was tokenized into words, and the count of superlative adjectives and adverbs was identified through natural language processing (NLP) based on part-of-speech tagging. Due to time constraints, a subset of 300,000 reviews from the full product review dataset was used for the empirical analysis.

Based on the results of this study, it would be prudent for digital markets to reconsider the practice of placing product reviews evaluated as highly useful at the front of the page, as such useful reviews can negatively affect sales performance. In addition, this study provides a basis for extending rating-oriented purchase review research to a text mining basis and offers implications for research related to the sales performance of the digital market.

Furthermore, an analysis was conducted to examine the relationship between superlative adjectives and adverbs in the text and rating extremity. The standard deviation of ratings, commonly used in previous product review studies (Mudambi & Schuff, 2010; DiMaggio et al., 1996), was selected as the measure of rating extremity. As shown in Table 1, the correlation between review text extremity and the standard deviation of ratings was statistically significant and positive.

The variables that could be included in the model were limited by the use of previously collected big data. In addition, it is necessary to examine reviews from other digital markets. Better research results could be obtained if big data suited to the research model were actively collected on the basis of prior research.

5. Conclusions

This study used big data analysis to analyze how product review factors such as helpfulness, which are closely related to the final stage of distribution in the digital market, affect product sales performance. To verify the impact of the helpfulness of product reviews on sales performance, this study employed part-of-speech tagging, a text mining technique widely used in big data analysis. The aim was to examine the influence of product review extremity on helpfulness evaluations and to confirm the mediating effect of helpfulness on sales performance. The analysis revealed that product reviews evaluated as helpful could exert a negative mediating effect on sales performance.

A limitation of this study is that it could not add more factors to improve the results because it was based on previously collected big data. Targeting only the Amazon digital market is another limitation. If sufficient resources become available, more sophisticated research could be conducted by collecting big data designed around prior studies. This study is nevertheless meaningful in that it explored a new aspect of product review research based on big data.

This study is meaningful as it integrates rating-focused product review research with big data analysis methodology to identify factors affecting the sales performance of digital markets. It also offers implications suggesting that helpfulness of product reviews might lead to adverse outcomes.

This study addresses the influence of product reviews, a topic of high importance to the digital market. Although research to date has mainly been rating-oriented, this study is differentiated in that it incorporates text mining-based analysis. In addition, the finding that product reviews evaluated as useful can have a negative mediating effect on sales performance carries strong implications for digital market managers who place useful reviews at the forefront.

References

  1. Akter, S., & Wamba, S. F. (2016). Big data analytics in Ecommerce: a systematic review and agenda for future research. Electronic Markets, 26(2), 173-194. https://doi.org/10.1007/s12525-016-0219-0
  2. Amblee, N., & Bui, T. (2011). Harnessing the influence of social proof in online shopping: The effect of electronic word of mouth on sales of digital microproducts. International Journal of Electronic Commerce, 16(2), 91-114. https://doi.org/10.2753/JEC1086-4415160205
  3. Bala, M., & Verma, D. (2018). A critical review of digital marketing. International Journal of Management, IT & Engineering, 8(10), 321-339.
  4. Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51(6), 1173.
  5. Büschken, J., & Allenby, G. M. (2016). Sentence-based text analysis for customer reviews. Marketing Science, 35(6), 953-975. https://doi.org/10.1287/mksc.2016.0993
  6. Cao, Q., Duan, W., & Gan, Q. (2011). Exploring determinants of voting for the "helpfulness" of online user reviews: A text mining approach. Decision Support Systems, 50(2), 511-521. https://doi.org/10.1016/j.dss.2010.11.009
  7. Chadwick, A. (2007). Digital network repertoires and organizational hybridity. Political Communication, 24(3), 283-301. https://doi.org/10.1080/10584600701471666
  8. Chaffey, D., & Smith, P. R. (2013). eMarketing eXcellence: Planning and optimizing your digital marketing. London, UK: Routledge.
  9. Chaffey, D., & Smith, P. R. (2022). Digital marketing excellence: Planning, optimizing and integrating online marketing. New York, NY: Taylor & Francis.
  10. Chakraborty, I., & Maity, P. (2020). COVID-19 outbreak: Migration, effects on society, global environment and prevention. Science of the Total Environment, 728, 1-7.
  11. Chen, Y. C., Shang, R. A., & Kao, C. Y. (2009). The effects of information overload on consumers' subjective state towards buying decision in the internet shopping environment. Electronic Commerce Research and Applications, 8(1), 48-58. https://doi.org/10.1016/j.elerap.2008.09.001
  12. Chevalier,J. A., & Mayzlin, D. (2006). The effect of word of mouth on sales: Online book reviews. Journal of Marketing Research, 43(3), 345-354. https://doi.org/10.1509/jmkr.43.3.345
  13. Coelho, P. S., & Esteves, S. P. (2007). The choice between a five-point and a ten-point scale in the framework of customer satisfaction measurement. International Journal of Market Research, 49(3), 313-339.
  14. Croarkin, E., Danoff, J., & Barnes, C. (2004). Evidence-based rating of upper-extremity motor function tests used for people following a stroke. Physical Therapy, 84(1), 62-74. https://doi.org/10.1093/ptj/84.1.62
  15. Danescu-Niculescu-Mizil, C., Kossinets, G., Kleinberg, J., & Lee, L. (2009, April). How opinions are received by online communities: a case study on Amazon. com helpfulness votes. Proceedings of the 18th International Conference on World Wide Web (pp. 141-150). April 20-24, Madrid, Spain, Association for Computing Machinery.
  16. De Langhe, B., Fernbach, P. M., & Lichtenstein, D. R. (2016). Navigating by the stars: Investigating the actual and perceived validity of online user ratings. Journal of Consumer Research, 42(6), 817-833. https://doi.org/10.1093/jcr/ucv047
  17. Decker, R., & Trusov, M. (2010). Estimating aggregate consumer preferences from online product reviews. International Journal of Research in Marketing, 27(4), 293-307. https://doi.org/10.1016/j.ijresmar.2010.09.001
  18. Degeratu, A. M., Rangaswamy, A., & Wu, J. (2000). Consumer choice behavior in online and traditional supermarkets: The effects of brand name, price, and other search attributes. International Journal of Research in Marketing, 17(1), 55-78. https://doi.org/10.1016/S0167-8116(00)00005-7
  19. Dellarocas, C., Zhang, X., & Awad, N. F. (2007). Exploring the value of online product reviews in forecasting sales: The case of motion pictures. Journal of Interactive Marketing, 21(4), 23-45. https://doi.org/10.1002/dir.20087
  20. Dhanasobhon, S., Chen, P. Y., Smith, M., & Chen, P. Y. (2007). An analysis of the differential impact of reviews and reviewers at Amazon.com. Proceedings of the International Conference on Information Systems, ICIS 2007, (pp.94). Montreal, Quebec, Canada, Dec. 9-12, Association for Information Systems. 
  21. DiMaggio, P., Evans, J., & Bryson, B. (1996). Have American's social attitudes become more polarized?. American Journal of Sociology, 102(3), 690-755. https://doi.org/10.1086/230995
  22. Dimoka, A., Hong, Y., & Pavlou, P. A. (2012). On product uncertainty in online markets: Theory and evidence. MIS Quarterly, 36(2), 395-426.
  23. Donthu, N., & Gustafsson, A. (2020). Effects of COVID-19 on business and research. Journal of Business Research, 117, 284-289. https://doi.org/10.1016/j.jbusres.2020.06.008
  24. Easterbrook, F. H. (1981). Insider trading, secret agents, evidentiary privileges, and the production of information. The Supreme Court Review, 1981, 309-365. https://doi.org/10.1086/scr.1981.3109548
  25. Eslami, S. P., Ghasemaghaei, M., & Hassanein, K. (2018). Which online reviews do consumers find most helpful? A multimethod investigation. Decision Support Systems, 113, 32-42. https://doi.org/10.1016/j.dss.2018.06.012
  26. Fang, B., Ye, Q., Kucukusta, D., & Law, R. (2016). Analysis of the perceived value of online tourism reviews: Influence of readability and reviewer characteristics. Tourism Management, 52, 498-506. https://doi.org/10.1016/j.tourman.2015.07.018
  27. Feng, S., Xing, L., Gogar, A., & Choi, Y. (2012). Distributional footprints of deceptive product reviews. Proceedings of the Sixth International AAAI Conference on Weblogs and Social Media (pp. 98-105). June 4-7, Dublin, Ireland, Association for the Advancement of Artificial Intelligence.
  28. Filieri, R. (2015). What makes online reviews helpful? A diagnosticity-adoption framework to explain informational and normative influencesin e-WOM. Journal of Business Research, 68(6), 1261-1270. https://doi.org/10.1016/j.jbusres.2014.11.006
  29. Filieri, R. (2016). What makes an online consumer review trustworthy?. Annals of Tourism Research, 58, 46-64. https://doi.org/10.1016/j.annals.2015.12.019
  30. Filieri, R., Raguseo, E., & Vitari, C. (2018). When are extreme ratings more helpful? Empirical evidence on the moderating effects of review characteristics and product type. Computers in Human Behavior, 88, 134-142. https://doi.org/10.1016/j.chb.2018.05.042
  31. Finstad, K. (2010). Response interpolation and scale sensitivity: Evidence against 5-point scales. Journal of Usability Studies, 5(3), 104-110.
  32. Flanagin, A. J., & Metzger, M. J. (2008). The credibility of volunteered geographic information. Geo Journal, 72(3-4), 137-148.
  33. Flanagin, A. J., Metzger, M. J., Pure, R., Markov, A., & Hartsell, E. (2014). Mitigating risk in ecommerce transactions: perceptions of information credibility and the role of user-generated ratings in product quality and purchase intention. Electronic Commerce Research, 14(1), 1-23.
  34. Floyd, K., Freling, R., Alhoqail, S., Cho, H. Y., & Freling, T. (2014). How online product reviews affect retail sales: A meta-analysis. Journal of Retailing, 90(2), 217-232. https://doi.org/10.1016/j.jretai.2014.04.004
  35. Füller, J., Jawecki, G., & Mühlbacher, H. (2007). Innovation creation by online basketball communities. Journal of Business Research, 60(1), 60-71.
  36. Garland, R. (1991). The mid-point on a rating scale: Is it desirable. Marketing Bulletin, 2(1), 66-70.
  37. Ghazal, A., Rabl, T., Hu, M., Raab, F., Poess, M., Crolotte, A., & Jacobsen, H. A. (2013, June). BigBench: Towards an industry standard benchmark for big data analytics. Proceedings of the 2013 ACM SIGMOD International Conference on Management of Data (pp. 1197-1208). June 22-27, New York, NY, Association for Computing Machinery.
  38. Ghose, A., & Ipeirotis, P. G. (2007). Designing novel review ranking systems: predicting the helpfulness and impact of reviews. Proceedings of the Ninth International Conference on Electronic Commerce (pp. 303-310). August 19-22, New York, NY, Association for Computing Machinery.
  39. Ghose, A., & Ipeirotis, P. G. (2010). Estimating the helpfulness and economic impact of product reviews: Mining text and reviewer characteristics. IEEE Transactions on Knowledge and Data Engineering, 23(10), 1498-1512. https://doi.org/10.1109/TKDE.2010.188
  40. Glaeser, S. (2018). The effects of proprietary information on corporate disclosure and transparency: Evidence from trade secrets. Journal of Accounting and Economics, 66(1), 163-193. https://doi.org/10.1016/j.jacceco.2018.04.002
  41. Granados, N. F., Gupta, A., & Kauffman, R. J. (2006). The impact of IT on market information and transparency: A unified theoretical framework. Journal of the Association for Information Systems, 7(3), 7.
  42. Gu, S., Slusarczyk, B., Hajizada, S., Kovalyova, I., & Sakhbieva, A. (2021). Impact of the covid-19 pandemic on online consumer purchasing behavior. Journal of Theoretical and Applied Electronic Commerce Research, 16(6), 2263-2281. https://doi.org/10.3390/jtaer16060125
  43. Gursoy, D., Malodia, S., & Dhir, A. (2022). The metaverse in the hospitality and tourism industry: An overview of current trends and future research directions. Journal of Hospitality Marketing & Management, 31(5), 527-534. https://doi.org/10.1080/19368623.2022.2072504
  44. Haque, T. U., Saber, N. N., & Shah, F. M. (2018, May). Sentiment analysis on large scale Amazon product reviews. Proceedings of the 2018 IEEE International Conference on Innovative Research and Development (ICIRD) (pp. 1-6). May 11-12, Bangkok, Thailand, IEEE.
  45. Hashem, T. N. (2020). Examining the influence of covid-19 pandemic in changing customers' orientation towards e-shopping. Modern Applied Science, 14(8), 59-76. https://doi.org/10.5539/mas.v14n8p59
  46. He, R., & McAuley, J. (2016, April). Ups and downs: Modeling the visual evolution of fashion trends with one-class collaborative filtering. Proceedings of the 25th International Conference on World Wide Web (pp. 507-517). April 11-15, Montreal, Quebec, Canada, International World Wide Web Conferences Steering Committee.
  47. Hirschberg, J., & Manning, C. D. (2015). Advances in natural language processing. Science, 349(6245), 261-266. https://doi.org/10.1126/science.aaa8685
  48. Hu, N., Koh, N. S., & Reddy, S. K. (2014). Ratings lead you to the product, reviews help you clinch it? The mediating role of online review sentiments on product sales. Decision Support Systems, 57, 42-53.
  49. Hu, N., Liu, L., & Zhang, J. J. (2008). Do online reviews affect product sales? The role of reviewer characteristics and temporal effects. Information Technology and Management, 9(3), 201-214. https://doi.org/10.1007/s10799-008-0041-2
  50. Jeyapriya, A., & Selvi, C. K. (2015, February). Extracting aspects and mining opinions in product reviews using supervised learning algorithm. Proceedings of the 2015 2nd International Conference on Electronics and Communication Systems (ICECS) (pp. 548-552). Feb. 26-27, Coimbatore, India, IEEE.
  51. Kannan, P. K. (2017). Digital marketing: A framework, review and research agenda. International Journal of Research in Marketing, 34(1), 22-45. https://doi.org/10.1016/j.ijresmar.2016.11.006
  52. Kapferer, J. N. (2008). The new strategic brand management: Creating and sustaining brand equity long term. London, UK: Kogan Page Publishers.
  53. Kaushik, K., Mishra, R., Rana, N. P., & Dwivedi, Y. K. (2018). Exploring reviews and review sequences on e-commerce platform: A study of helpful reviews on Amazon.in. Journal of Retailing and Consumer Services, 45, 21-32. https://doi.org/10.1016/j.jretconser.2018.08.002
  54. Killian, G., & McManus, K. (2015). A marketing communications approach for the digital era: Managerial guidelines for social media integration. Business Horizons, 58(5), 539-549. https://doi.org/10.1016/j.bushor.2015.05.006
  55. Kim, W. G., Li, J. J., & Brymer, R. A. (2016). The impact of social media reviews on restaurant performance: The moderating role of excellence certificate. International Journal of Hospitality Management, 55, 41-51.
  56. Kopalle, P. K., Fisher, R. J., Sud, B. L., & Antia, K. D. (2017). The effects of advertised quality emphasis and objective quality on sales. Journal of Marketing, 81(2), 114-126. https://doi.org/10.1509/jm.15.0353
  57. Kreimeyer, K., Foster, M., Pandey, A., Arya, N., Halford, G., Jones, S. F., ... & Botsis, T. (2017). Natural language processing systems for capturing and standardizing unstructured clinical information: a systematic review. Journal of Biomedical Informatics, 73, 14-29.
  58. Krishnamoorthy, S. (2015). Linguistic features for review helpfulness prediction. Expert Systems with Applications, 42(7), 3751-3759. https://doi.org/10.1016/j.eswa.2014.12.044
  59. Kuan, K. K., Hui, K. L., Prasarnphanich, P., & Lai, H. Y. (2015). What makes a review voted? An empirical investigation of review voting in online review systems. Journal of the Association for Information Systems, 16(1), 48-71. https://doi.org/10.17705/1jais.00386
  61. Labrecque, L. I., Vor Dem Esche, J., Mathwick, C., Novak, T. P., & Hofacker, C. F. (2013). Consumer power: Evolution in the digital age. Journal of Interactive Marketing, 27(4), 257-269. https://doi.org/10.1016/j.intmar.2013.09.002
  62. Lamberton, C., & Stephen, A. T. (2016). A thematic exploration of digital, social media, and mobile marketing: Research evolution from 2000 to 2015 and an agenda for future inquiry. Journal of Marketing, 80(6), 146-172. https://doi.org/10.1509/jm.15.0415
  63. Lee, J., Park, D. H., & Han, I. (2008). The effect of negative online consumer reviews on product attitude: An information processing view. Electronic Commerce Research and Applications, 7(3), 341-352. https://doi.org/10.1016/j.elerap.2007.05.004
  64. Lee, S., & Choeh, J. Y. (2020). The impact of online review helpfulness and word of mouth communication on box office performance predictions. Humanities and Social Sciences Communications, 7(1), 1-12.
  65. Levin, A. M., Levin, I. R., & Heath, C. E. (2003). Product category dependent consumer preferences for online and offline shopping features and their influence on multi-channel retail alliances. Journal of Electronic Commerce Research, 4(3), 85-93.
  66. Li, M., Huang, L., Tan, C. H., & Wei, K. K. (2013). Helpfulness of online product reviews as seen by consumers: Source and content features. International Journal of Electronic Commerce, 17(4), 101-136.
  67. Meyers-Levy, J., & Tybout, A. M. (1989). Schema congruity as a basis for product evaluation. Journal of Consumer Research, 16(1), 39-54. https://doi.org/10.1086/209192
  68. Miles, M. B. (1979). Qualitative data as an attractive nuisance: The problem of analysis. Administrative Science Quarterly, 24(4), 590-601. https://doi.org/10.2307/2392365
  69. Moon, S., Bergey, P. K., & Iacobucci, D. (2010). Dynamic effects among movie ratings, movie revenues, and viewer satisfaction. Journal of Marketing, 74(1), 108-121. https://doi.org/10.1509/jmkg.74.1.108
  70. Moon, S., Park, Y., & Seog Kim, Y. (2014). The impact of text product reviews on sales. European Journal of Marketing, 48(11/12), 2176-2197. https://doi.org/10.1108/EJM-06-2013-0291
  71. Mudambi, S. M., & Schuff, D. (2010). Research note: What makes a helpful online review? A study of customer reviews on Amazon. com. MIS Quarterly, 34(1), 185-200. https://doi.org/10.2307/20721420
  72. Murphy, G. B., Trailer, J. W., & Hill, R. C. (1996). Measuring performance in entrepreneurship research. Journal of Business Research, 36(1), 15-23.
  73. Nadeem, W., Juntunen, M., Hajli, N., & Tajvidi, M. (2021). The role of ethical perceptions in consumers' participation and value co-creation on sharing economy platforms. Journal of Business Ethics, 169(3), 421-441.
  74. Nadkarni, P. M., Ohno-Machado, L., & Chapman, W. W. (2011). Natural language processing: an introduction. Journal of the American Medical Informatics Association, 18(5), 544-551. https://doi.org/10.1136/amiajnl-2011-000464
  75. Olbrich, R., & D. Schultz, C. (2014). Multichannel advertising: Does print advertising affect search engine advertising?. European Journal of Marketing, 48(9/10), 1731-1756. https://doi.org/10.1108/EJM-10-2012-0569
  76. Owoputi, O., O'Connor, B., Dyer, C., Gimpel, K., Schneider, N., & Smith, N. A. (2013, June). Improved part-of-speech tagging for online conversational text with word clusters. Proceedings of the 2013 conference of the North American chapter of the association for computational linguistics: human language technologies (pp. 380-390). June 9-14, Atlanta, Georgia, Association for Computational Linguistics.
  77. Park, D. H., & Lee, J. (2008). eWOM overload and its effect on consumer behavioral intention depending on consumer involvement. Electronic Commerce Research and Applications, 7(4), 386-398. https://doi.org/10.1016/j.elerap.2007.11.004
  78. Park, D. H., Lee, J., & Han, I. (2007). The effect of on-line consumer reviews on consumer purchasing intention: The moderating role of involvement. International Journal of Electronic Commerce, 11(4), 125-148.
  79. Patrick, D. L., Burke, L. B., Gwaltney, C. J., Leidy, N. K., Martin, M. L., Molsen, E., & Ring, L. (2011). Content validity-establishing and reporting the evidence in newly developed patient-reported outcomes (PRO) instruments for medical product evaluation: ISPOR PRO good research practices task force report: part 1-eliciting concepts for a new PRO instrument. Value in Health, 14(8), 967-977. https://doi.org/10.1016/j.jval.2011.06.014
  80. Pavlou, P. A., & Gefen, D. (2005). Psychological contract violation in online marketplaces: Antecedents, consequences, and moderating role. Information Systems Research, 16(4), 372-399. https://doi.org/10.1287/isre.1050.0065
  81. Prahalad, C. K., & Ramaswamy, V. (2000). Co-opting customer competence. Harvard Business Review, 78(1), 79-90.
  82. Rahmati, P., Tafti, A. R., Westland, J. C., & Hidalgo, C. (2021). When all products are digital: Complexity and intangible value in the ecosystem of digitizing firms. MIS Quarterly, 45(3), 1025-1058.
  83. Rao, S., Goldsby, T. J., & Iyengar, D. (2009). The marketing and logistics efficacy of online sales channels. International Journal of Physical Distribution & Logistics Management, 39(2), 106-130. https://doi.org/10.1108/09600030910942386
  84. Rapp, A., Beitelspacher, L. S., Grewal, D., & Hughes, D. E. (2013). Understanding social media effects across seller, retailer, and consumer interactions. Journal of the Academy of Marketing Science, 41(5), 547-566. https://doi.org/10.1007/s11747-013-0326-9
  85. Ravi, K., & Ravi, V. (2015). A survey on opinion mining and sentiment analysis: tasks, approaches and applications. Knowledge-based Systems, 89, 14-46. https://doi.org/10.1016/j.knosys.2015.06.015
  86. Reyes, A., & Rosso, P. (2012). Making objective decisions from subjective data: Detecting irony in customer reviews. Decision Support Systems, 53(4), 754-760.
  87. Roche, J. R., Dillon, P. G., Stockdale, C. R., Baumgard, L. H., & VanBaale, M. J. (2004). Relationships among international body condition scoring systems. Journal of Dairy Science, 87(9), 3076-3079. https://doi.org/10.3168/jds.S0022-0302(04)73441-4
  88. Ryan, D. (2016). Understanding digital marketing: marketing strategies for engaging the digital generation. London, UK: Kogan Page Publishers.
  89. Salehan, M., & Kim, D. J. (2016). Predicting the performance of online consumer reviews: A sentiment mining approach to big data analytics. Decision Support Systems, 81, 30-40. https://doi.org/10.1016/j.dss.2015.10.006
  90. Statista (2021). E-commerce market share of leading e-retailers worldwide in 2020, based on GMV. statista.com. Retrieved November 30, 2023, from https://www.statista.com/statistics/664814/global-ecommerce-market-share/
  91. Sen, S., & Lerman, D. (2007). Why are you telling me this? An examination into negative consumer reviews on the web. Journal of Interactive Marketing, 21(4), 76-94. https://doi.org/10.1002/dir.20090
  92. Sharma, A., & Jhamb, D. (2020). Changing consumer behaviours towards online shopping-an impact of Covid 19. Academy of Marketing Studies Journal, 24(3), 1-10.
  93. Shen, W., Hu, Y. J., & Ulmer, J. R. (2015). Competing for attention. Mis Quarterly, 39(3), 683-696. https://doi.org/10.25300/MISQ/2015/39.3.08
  94. Skowronski, J. J., & Carlston, D. E. (1989). Negativity and extremity biases in impression formation: A review of explanations. Psychological Bulletin, 105(1), 131.
  95. Sobel, M. E. (1982). Asymptotic confidence intervals for indirect effects in structural equation models. Sociological Methodology, 13, 290-312. https://doi.org/10.2307/270723
  96. Sun, S., Luo, C., & Chen, J. (2017). A review of natural language processing techniques for opinion mining systems. Information Fusion, 36, 10-25.
  97. Sutton, J., & Austin, Z. (2015). Qualitative research: Data collection, analysis, and management. The Canadian Journal of Hospital Pharmacy, 68(3), 226.
  98. Svahn, F., Mathiassen, L., & Lindgren, R. (2017). Embracing digital innovation in incumbent firms. MIS Quarterly, 41(1), 239-254. https://doi.org/10.25300/MISQ/2017/41.1.12
  99. Taiminen, H. M., & Karjaluoto, H. (2015). The usage of digital marketing channels in SMEs. Journal of Small Business and Enterprise Development, 22(4), 633-651. https://doi.org/10.1108/JSBED-05-2013-0073
  100. Taken Smith, K. (2012). Longitudinal study of digital marketing strategies targeting Millennials. Journal of Consumer Marketing, 29(2), 86-92.
  101. Tarka, P. (2017). The comparison of estimation methods on the parameter estimates and fit indices in SEM model under 7-point Likert scale. Archives of Data Science, 2(1), 1-16.
  102. Thabit, T., & Raewf, M. (2018). The evaluation of marketing mix elements: A case study. International Journal of Social Sciences & Educational Studies, 4(4), 100-109.
  103. Tiago, M. T. P. M. B., & Verissimo, J. M. C. (2014). Digital marketing and social media: Why bother?. Business Horizons, 57(6), 703-708. https://doi.org/10.1016/j.bushor.2014.07.002
  104. Wang, Y., Wang, J., & Yao, T. (2019). What makes a helpful online review? A meta-analysis of review characteristics. Electronic Commerce Research, 19, 257-284.
  105. Wisner, J. D. (2020). Operations management: A supply chain process approach. Solana Beach, CA: Cognella Academic Publishing.
  106. Yang, S., & Ghose, A. (2010). Analyzing the relationship between organic and sponsored search advertising: Positive, negative, or zero interdependence?. Marketing Science, 29(4), 602-623. https://doi.org/10.1287/mksc.1090.0552
  107. Ye, Q., Law, R., & Gu, B. (2009). The impact of online user reviews on hotel room sales. International Journal of Hospitality Management, 28(1), 180-182.
  108. Yi, J., Nasukawa, T., Bunescu, R., & Niblack, W. (2003, November). Sentiment analyzer: Extracting sentiments about a given topic using natural language processing techniques. Proceedings of the Third IEEE International Conference on Data Mining (pp. 427-434). Nov. 19-22, Melbourne, Florida, IEEE.
  109. Zhang, K., Cheng, Y., Liao, W. K., & Choudhary, A. (2011). Mining millions of reviews: a technique to rank products based on importance of reviews. Proceedings of the 13th International Conference on Electronic Commerce (pp. 1-8). Aug. 3-5, Liverpool UK, Association for Computing Machinery.
  110. Zhao, H., Liu, Z., Yao, X., & Yang, Q. (2021). A machine learning-based sentiment analysis of online product reviews with a novel term weighting and feature selection approach. Information Processing & Management, 58(5), 102656.
  111. Zhou, J., Xu, R., & Jiang, C. (2022). Sales volume and online reviews in a sequential game. Journal of the Operational Research Society, 74(10), 1-12.
  112. Zikmund, W. G., & Babin, B. J. (2015). Essentials of Marketing Research. Boston, MA: Cengage Learning.