Non-price Competition, Real Rigidities and Inflation Dynamics
In the last decade, the analytical progress achieved in the New Keynesian literature has been remarkable. Many of the early assumptions have been relaxed, leading to medium-scale macroeconomic models that are now able to capture many features of real-world data. Nevertheless, modern-day New Keynesian models still assume, as did their early counterparts, that firms compete in the market with no tools other than their relative prices. In particular, this literature has so far neglected the consequences of extending competition between firms to the non-price dimension. This paper tries to fill this gap by enriching the canonical New Keynesian framework to include both price and non-price competition. This has important consequences for the analysis of inflation dynamics, modifying in particular the inflation-marginal cost relationship. As a general result, we show that any activity by firms that boosts demand for their products, without directly affecting their prices, dampens the overall degree of real rigidities in price-setting. Keywords: Non-price competition, inflation dynamics, real rigidity.
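For context, a minimal textbook statement of the inflation-marginal cost relationship the abstract refers to (the canonical New Keynesian Phillips curve, not the paper's extended specification with non-price competition): the slope falls as real rigidities strengthen, so anything that dampens real rigidities makes inflation more responsive to marginal cost.

```latex
% Canonical New Keynesian Phillips curve (textbook form, not the paper's
% extended specification). \pi_t: inflation, \widehat{mc}_t: real marginal
% cost gap, \kappa: slope, decreasing in the degree of real rigidity.
\pi_t \;=\; \beta\,\mathbb{E}_t\,\pi_{t+1} \;+\; \kappa\,\widehat{mc}_t
```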
Advertising, Labor Supply and the Aggregate Economy. A Long-Run Analysis
This paper studies the influence of persuasive advertising in a neoclassical growth model with monopolistically competitive firms. Our findings show that advertising can significantly affect the stationary equilibrium of a model economy in which the labor supply is endogenous. In this case, for empirically plausible calibrations, we find that the equilibrium levels of hours worked, GDP, and consumption increase with the amount of resources invested in advertising. These findings are consistent with a new stylized fact provided in this paper: over the past decade, per-capita advertising expenditures have been positively correlated with per-capita output, consumption and hours worked across OECD countries. Because of the connection between advertising and labor supply, we show that our model improves on its neoclassical counterpart in explaining both within-country and cross-country variability of hours worked per capita. Keywords: Advertising, Labor Wedge, Labor supply, Economic Growth, Hours Worked.
The last fifteen years of stagnation in Italy: A Business Cycle Accounting Perspective
In this paper, we investigate possible sources of declining economic growth performance in Italy starting around the middle of the ’90s. A long-run data analysis suggests that the poor performance of the Italian economy cannot be ascribed to an unfortunate business cycle contingency. The rest of the euro area countries have shown better performance, and the macroeconomic data show that the Italian economy has not grown as rapidly as these other European economies. We investigate the sources of economic fluctuations in Italy by applying the Business Cycle Accounting procedure introduced by Chari, Kehoe and McGrattan (2007). We analyze the relative importance of efficiency, labor, investment and government wedges for business cycles in Italy over the 1982-2008 period. We find that different wedges have played different roles during the period, but the efficiency wedge is revealed to be the main factor responsible for the stagnation phase beginning around 1995. Our findings also show that the improvement in labor market distortions that occurred in Italy during the ’90s provided an alleviating effect, preventing an even stronger slowdown in per capita output growth.
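As background on the procedure mentioned above, the prototype economy of Business Cycle Accounting maps the data into four wedges roughly as sketched below; this is the standard formulation from the literature (Chari, Kehoe and McGrattan, 2007), not a restatement of the paper's exact specification.

```latex
% Prototype economy of Business Cycle Accounting (standard formulation).
% A_t: efficiency wedge, \tau_{l,t}: labor wedge, \tau_{x,t}: investment wedge,
% g_t: government wedge; \theta is capital's share, \delta the depreciation rate.
\begin{align*}
  y_t &= A_t\,k_t^{\theta} l_t^{1-\theta} && \text{(efficiency wedge)}\\
  -\frac{U_{l,t}}{U_{c,t}} &= (1-\tau_{l,t})(1-\theta)\,\frac{y_t}{l_t} && \text{(labor wedge)}\\
  (1+\tau_{x,t})\,U_{c,t} &= \beta\,\mathbb{E}_t\!\Big[U_{c,t+1}\Big(\theta\,\tfrac{y_{t+1}}{k_{t+1}} + (1-\delta)(1+\tau_{x,t+1})\Big)\Big] && \text{(investment wedge)}\\
  c_t + x_t + g_t &= y_t && \text{(government wedge)}
\end{align*}
```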
Advertising and Business Cycle Fluctuations
This paper provides new empirical evidence for quarterly U.S. aggregate advertising expenditures, showing that advertising has a well-defined pattern over the business cycle. To understand this pattern we develop a general equilibrium model where targeted advertising increases the marginal utility of the advertised good. Advertising intensity is endogenously determined by profit-maximizing firms. We embed this assumption into an otherwise standard model of the business cycle with monopolistic competition. We find that advertising affects the aggregate dynamics in a relevant way, and it exacerbates the welfare costs of fluctuations for the consumer. Finally, we provide estimates of our setup using Bayesian techniques. Keywords: Advertising, DSGE model, Business Cycle fluctuations, Bayesian.
Size, Trend, and Policy Implications of the Underground Economy
We study the underground economy in a dynamic and stochastic general equilibrium framework. Our model combines limited tax enforcement with an otherwise standard two-sector neoclassical stochastic growth model. The Bayesian estimation of the model based on Italian data provides evidence in favor of an important underground sector in Italy, with a size that has steadily increased over the whole sample period. We show that this pattern is due to a persistent increase in taxation. Fiscal policy experiments suggest that a moderate tax cut, along with a stronger effort in the monitoring process, causes a sizeable reduction in the size of the underground economy and a positive stimulus to the regular sector, which jointly increase total fiscal revenues.
Comparative analysis of predictive methods for early assessment of compliance with continuous positive airway pressure therapy
Background: Patients suffering from obstructive sleep apnea are mainly treated with continuous positive airway pressure (CPAP). Although it is a highly effective treatment, compliance with this therapy is difficult to achieve, with serious consequences for the patients' health. Unfortunately, there is a clear lack of clinical analytical tools to support the early prediction of compliant patients. Methods: This work takes a further step in this direction by building classifiers of compliance with CPAP therapy at three different moments of the patient follow-up: before the therapy starts (baseline) and at months 1 and 3 after the baseline. Results: The results of the clinical trial show that month 3 was the time point with the most accurate classifier, reaching f1-scores of 87% in cross-validation and 84% on the test set. At month 1, performance was almost as high as at month 3, with f1-scores of 82% and 84%. At baseline, when no information on the patients' CPAP use was yet available, the best classifier achieved f1-scores of 73% in cross-validation and 76% on the test set, respectively. Subsequent analyses carried out with the best classifiers of each time point revealed baseline factors (i.e. headaches, psychological symptoms, arterial hypertension and the EuroQol visual analog scale) closely related to the prediction of compliance independently of the time point. In addition, among the variables collected only during the patients' follow-up, Epworth and the average nighttime hours were the most important for predicting compliance with CPAP. Conclusions: The best classifiers reported high performance after one month of treatment, with the third month being when significant differences were achieved with respect to the baseline. Four baseline variables were found to be relevant for the prediction of compliance with CPAP at each time point. Two further characteristics were also highlighted for the prediction of compliance at months 1 and 3. This work is part of the myOSA project (RTC-2014-3138-1), funded by the Spanish Ministry of Economy and Competitiveness (Ministerio de Economía y Competitividad) under the framework "Retos-Colaboración", State Scientific and Technical Research and Innovation Plan 2013-2016. The study was also partially funded by the European Community under the "H2020-EU.3.1. – Societal Challenges – Health, demographic change and well-being" programme, project grant agreement number 689802 (CONNECARE).
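A minimal sketch of the kind of compliance-classification exercise described above, assuming a tabular dataset with hypothetical column names and a random-forest classifier as a stand-in (the abstract does not specify the model family); it reports a cross-validated f1 and a held-out test f1, mirroring the cross-validation/test reporting in the abstract.

```python
# Hypothetical sketch, not the study's code: cross-validated and held-out
# F1 for a CPAP-compliance classifier. File name, columns and the choice
# of RandomForestClassifier are assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import f1_score

df = pd.read_csv("cpap_followup.csv")            # hypothetical file
features = ["headaches", "psych_symptoms",       # hypothetical columns,
            "hypertension", "eq_vas"]            # mirroring the baseline factors
X, y = df[features], df["compliant"]             # binary compliance label

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=300, random_state=0)

# Cross-validated F1 on the training split, then F1 on the held-out test set.
cv_f1 = cross_val_score(clf, X_train, y_train, cv=5, scoring="f1").mean()
test_f1 = f1_score(y_test, clf.fit(X_train, y_train).predict(X_test))
print(f"CV F1: {cv_f1:.2f}  Test F1: {test_f1:.2f}")
```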
Comparative Analysis of Decision Tree, Random Forest, SVM, and Neural Network Models for Predicting Earthquake Magnitude
This study conducts a comparative analysis of four machine learning algorithms—Decision Tree, Random Forest, Support Vector Machine (SVM), and Neural Network—to predict earthquake magnitudes using the United States Geological Survey (USGS) earthquake dataset. The analysis evaluates each model's performance based on key metrics: Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), and the coefficient of determination (R²). The Random Forest model demonstrated superior performance, achieving the lowest MAE (0.217051), lowest RMSE (0.322398), and highest R² (0.574261), indicating its robustness in capturing complex, non-linear relationships in seismic data. SVM also showed strong performance, with competitive accuracy and robustness. Decision Tree and Neural Network models, while useful, had comparatively higher error rates and lower R² values. The study highlights the potential of ensemble learning and kernel methods in enhancing earthquake magnitude prediction accuracy. Practical implications of the findings include the integration of these models into early warning systems, urban planning, and the insurance industry for better risk assessment and management. Despite the promising results, the study acknowledges limitations such as reliance on historical data and the computational intensity of certain models. Future research is suggested to explore additional data sources, advanced machine learning techniques, and more efficient algorithms to further improve predictive capabilities. By providing a comprehensive evaluation of these models, this research contributes valuable insights into the effectiveness of various machine learning techniques for earthquake prediction, guiding future efforts to develop more accurate and reliable predictive models.
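A minimal sketch of how such a four-model comparison on MAE, RMSE and R² could be set up with scikit-learn; the CSV path, feature columns and hyperparameters are assumptions, not the study's actual pipeline or preprocessing.

```python
# Hypothetical sketch: compare four regressors on an earthquake catalog
# using MAE, RMSE and R^2. Path, features and hyperparameters are assumed.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

df = pd.read_csv("usgs_catalog.csv")                 # hypothetical path
X = df[["latitude", "longitude", "depth"]]           # hypothetical features
y = df["mag"]                                        # magnitude target
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "Decision Tree": DecisionTreeRegressor(random_state=42),
    "Random Forest": RandomForestRegressor(n_estimators=200, random_state=42),
    "SVM": make_pipeline(StandardScaler(), SVR()),
    "Neural Network": make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=42)),
}

for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    mae = mean_absolute_error(y_te, pred)
    rmse = np.sqrt(mean_squared_error(y_te, pred))
    r2 = r2_score(y_te, pred)
    print(f"{name:15s} MAE={mae:.3f} RMSE={rmse:.3f} R2={r2:.3f}")
```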
From Essentialism to the Essential: Pragmatics and Meaning of Puneño Sikuri Performance in Lima
- …
