Tuesday, Aug 5: 8:30 AM - 10:20 AM
0846
Topic-Contributed Paper Session
Music City Center
Room: CC-201A
Applied: Yes
Main Sponsor
Business and Economic Statistics Section
Co Sponsors
Business Analytics/Statistics Education Interest Group
Section on Risk Analysis
Presentations
In this study, we present a novel statistical approach to corporate bankruptcy prediction that leverages complex network analysis. We introduce a two-layered network structure that captures both supply-chain relationships and co-investment patterns among companies, providing a more comprehensive view of corporate interdependencies than traditional methods. To analyze this structure, we develop a flexible multi-layered latent position model that efficiently extracts key features from the network. Our methodology employs advanced statistical techniques to estimate the latent positions underlying the two-layered network, which are then used as predictors in a bankruptcy prediction model. Using US public company data, we demonstrate that incorporating these network-derived features significantly enhances the predictive power of bankruptcy models. Our results reveal that the latent positions estimated from the network structure capture relational information that is highly relevant to a company's financial stability. This approach not only outperforms traditional prediction methods but also provides interpretable insights into the role of corporate interconnectedness in financial risk. Our work offers a robust statistical framework for integrating complex relational data into predictive modeling for bankruptcy risk assessment.
Keywords
Network Analysis
Corporate Bankruptcy
Speaker
Tianhai Zu, University of Texas at San Antonio
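As a rough illustration of the kind of feature construction the abstract describes (the authors' actual model and estimator are not specified here), latent positions for a two-layer company network can be sketched with a per-layer adjacency spectral embedding whose coordinates are concatenated into a predictor matrix. The sizes, the random layers, and the spectral-embedding choice below are all illustrative assumptions, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 30, 2  # companies and latent dimension (illustrative values)

# Two symmetric adjacency layers: supply-chain ties and co-investment ties
# (randomly generated here purely for illustration)
def random_layer(p):
    upper = np.triu(rng.random((n, n)) < p, 1)
    return (upper | upper.T).astype(float)

supply, invest = random_layer(0.10), random_layer(0.15)

def spectral_positions(A, d):
    """Adjacency spectral embedding: scaled top-d eigenvectors of A."""
    vals, vecs = np.linalg.eigh(A)
    top = np.argsort(np.abs(vals))[::-1][:d]
    return vecs[:, top] * np.sqrt(np.abs(vals[top]))

# Concatenate per-layer positions into one predictor matrix; these columns
# would then feed a downstream bankruptcy model (e.g., logistic regression)
features = np.hstack([spectral_positions(supply, d),
                      spectral_positions(invest, d)])
```

Each company contributes d coordinates per layer, so the feature matrix has 2d columns that summarize its position in both relational layers at once.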
In the evolving landscape of digital commerce, adaptive dynamic pricing strategies are essential for gaining a competitive edge. This paper introduces novel doubly nonparametric random utility models that eschew the parametric assumptions traditionally used in estimating the mean utility function and noise distribution of consumer demand. Existing nonparametric methods such as multi-scale Distributional Nearest Neighbors (DNN and TDNN), initially designed for offline regression, face challenges in dynamic online pricing due to design limitations, such as the indirect observability of utility-related variables and the absence of uniform convergence guarantees. We address these challenges with innovative population equations that facilitate nonparametric estimation within decision-making frameworks, and we establish new analytical results on the uniform convergence rates of DNN and TDNN, enhancing their applicability in dynamic environments. Our theoretical analysis confirms that the statistical learning rates for the mean utility function and noise distribution are minimax optimal. We also derive a regret bound that illustrates the critical interaction between model dimensionality and noise distribution smoothness, deepening our understanding of dynamic pricing under varied market conditions. These contributions offer substantial theoretical insights and practical tools for implementing effective, data-driven pricing strategies, advancing the theoretical framework of pricing models and providing robust methodologies for navigating the complexities of modern markets.
Keywords
Dynamic pricing
Nonparametric
Regret analysis
Speaker
Lan Gao, University of Tennessee Knoxville
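For readers unfamiliar with the estimators named above, here is a minimal offline, one-dimensional sketch of distributional nearest neighbors: the 1-NN estimator averaged over all size-s subsamples, which admits a closed form as a rank-weighted average, plus a two-scale combination whose weights cancel an assumed leading bias term of order s^(-2/d). The offline setting, the assumed bias form, and all parameters are simplifications; the paper's online pricing framework and uniform convergence analysis are not reproduced here:

```python
import numpy as np
from math import comb

def dnn(x0, X, y, s):
    """Distributional nearest neighbors at x0: the 1-NN estimator averaged
    over all size-s subsamples, computed via closed-form rank weights."""
    n = len(y)
    order = np.argsort(np.abs(X - x0))           # ranks by distance to x0
    w = np.array([comb(n - 1 - i, s - 1) for i in range(n)], dtype=float)
    return float(np.dot(w / comb(n, s), y[order]))

def tdnn(x0, X, y, s1, s2, d=1):
    """Two-scale DNN: weights solve w1 + w2 = 1 and
    w1*s1**(-2/d) + w2*s2**(-2/d) = 0, cancelling the assumed leading bias."""
    a, b = s1 ** (-2 / d), s2 ** (-2 / d)
    w1 = -b / (a - b)
    return w1 * dnn(x0, X, y, s1) + (1 - w1) * dnn(x0, X, y, s2)

# Small demo on simulated regression data (illustrative only)
rng = np.random.default_rng(0)
Xs = rng.normal(size=60)
ys = Xs ** 2 + 0.05 * rng.normal(size=60)
mean_est = dnn(0.0, Xs, ys, 1)        # s = 1 reduces to the sample mean
nn_est = dnn(0.0, Xs, ys, 60)         # s = n reduces to the nearest neighbor
nn_y = ys[np.argmin(np.abs(Xs))]
ts_est = tdnn(0.0, Xs, ys, 2, 16)     # two-scale debiased estimate at x0 = 0
```

The two boundary cases (s = 1 gives the sample mean, s = n gives the single nearest neighbor) show how the subsampling scale s trades off bias and variance.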
We propose a factor-based method for estimating treatment effects in panel data models with spatial interference. We focus on a binary treatment in a non-experimental setting and characterise the potential outcomes by a modified factor model that allows for interference between any two units; we also provide two economic illustrations of this factor model. The estimation of treatment effects is recast as disentangling sub-vectors of factors and loadings from the full vectors, and is accomplished by exploiting the factor structure implied by the model. Under standard assumptions, the estimator of each individual- and time-specific treatment effect is shown to be consistent and asymptotically normal as the numbers of units, pre-treatment periods, and post-treatment periods go to infinity. We find consistent estimators of the associated asymptotic variances, which leads to asymptotically pivotal inference on the treatment effects. The method can be extended to models with covariates.
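To make the factor-based counterfactual idea concrete, here is a stylized sketch on simulated data: factors are estimated from control units by principal components, the treated unit's loadings are fit on pre-treatment periods, and per-period effects are the gaps between observed and predicted post-treatment outcomes. This is a generic interactive-fixed-effects illustration under simplifying assumptions (one treated unit, known factor number, no interference), not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
N, T0, T1, r = 40, 30, 10, 2   # units, pre/post periods, factors (illustrative)
T = T0 + T1

# Simulated factor structure Y = L F' + noise; the last unit is treated after T0
F = rng.normal(size=(T, r))
L = rng.normal(size=(N, r))
Y = L @ F.T + 0.1 * rng.normal(size=(N, T))
Y[-1, T0:] += 1.0              # true constant treatment effect of 1.0

controls, treated = Y[:-1], Y[-1]

# Step 1: estimated factor space from control units via principal components
_, _, Vt = np.linalg.svd(controls, full_matrices=False)
F_hat = Vt[:r].T               # T x r, spans the estimated factor space

# Step 2: treated unit's loadings from the pre-treatment fit
lam, *_ = np.linalg.lstsq(F_hat[:T0], treated[:T0], rcond=None)

# Step 3: per-period effects = observed minus predicted counterfactual
tau_hat = treated[T0:] - F_hat[T0:] @ lam
```

With small noise and strong factors, the average of `tau_hat` lands close to the true effect of 1.0, illustrating why recovering the factor structure is the crux of the estimation problem.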
Dimension reduction is vital for high-dimensional data analysis, yet selecting the intrinsic dimension presents significant challenges due to variability across methods and a lack of consensus on selection criteria. This study introduces novel hypothesis tests with semiparametric and parametric bootstrap-based approaches to quantify the uncertainty associated with determining the intrinsic dimension. We develop efficient algorithms to construct confidence intervals with desirable coverage rates and hypothesis tests with valid type I error control. We apply our method to daily stock returns and identify the intrinsic dimension that effectively captures the data structure, constructing confidence intervals at different significance levels to assess the associated uncertainty. This study provides a systematic framework for uncertainty quantification in dimension reduction, with applicability to a range of dimension reduction techniques.
Keywords
Uncertainty Quantification
Dimension Reduction
Stock Market Analysis
Principal Component Analysis
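The parametric-bootstrap side of the idea can be sketched as follows: under H0 that the intrinsic dimension is k, fit a k-factor Gaussian (PPCA-style) model, resample from the fitted model, and compare an eigenvalue-based statistic with its bootstrap distribution. The Gaussian model, the eigenvalue-ratio statistic, and all sizes below are illustrative assumptions; the paper's semiparametric variant and confidence-interval construction are not reproduced:

```python
import numpy as np

def eig_ratio(X, k):
    """(k+1)-th largest eigenvalue of the sample covariance over the mean of
    the trailing p - k eigenvalues: large when H0 omits a real factor."""
    vals = np.sort(np.linalg.eigvalsh(np.cov(X.T)))[::-1]
    return vals[k] / vals[k:].mean()

def pboot_pvalue(X, k, n_boot=200, seed=0):
    """Parametric bootstrap p-value for H0: intrinsic dimension = k."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    vals, vecs = np.linalg.eigh(np.cov(X.T))
    vals, vecs = vals[::-1], vecs[:, ::-1]
    sigma2 = vals[k:].mean()                 # isotropic residual variance
    load = vecs[:, :k] * np.sqrt(np.maximum(vals[:k] - sigma2, 0.0))
    obs, exceed = eig_ratio(X, k), 0
    for _ in range(n_boot):                  # resample from the fitted model
        Xb = rng.normal(size=(n, k)) @ load.T \
            + np.sqrt(sigma2) * rng.normal(size=(n, p))
        exceed += eig_ratio(Xb, k) >= obs
    return (exceed + 1) / (n_boot + 1)

# Simulated "returns" with a 3-dimensional factor structure (illustrative)
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3)) @ rng.normal(size=(10, 3)).T \
    + 0.5 * rng.normal(size=(200, 10))
p_too_small, p_correct = pboot_pvalue(X, 1), pboot_pvalue(X, 3)
```

Testing a sequence of candidate k values this way, and collecting those not rejected, is one generic route to an interval of plausible intrinsic dimensions.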