A methodology for analyzing commercial processor performance numbers
The wealth of performance numbers provided by benchmarking corporations makes it difficult to detect trends across commercial machines. A proposed methodology, based on statistical data analysis, simplifies the exploration of these machines' large performance datasets.
Study of femtosecond laser beam focusing in a direct-write system
In collaboration with the Universitat Autònoma de Barcelona (UAB) and the Universitat de Barcelona (UB). Direct-write techniques are a versatile option in rapid-prototyping applications because they can transfer a custom pattern directly from a digital file. Lasers are a distinguished tool that enables non-contact direct-write techniques with the ability to add, remove, and modify different types of materials. Moreover, they have high focusing power and offer high spatial resolution when a femtosecond laser is used, owing to the reduction of thermal effects. Additive and subtractive techniques can be performed in a single laser-based direct-write system with minimal variations in the setup. In all cases, properties of the laser beam, such as the beam width or the morphology of the intensity distribution, affect the results of the laser processing. The aim of this work is to study laser propagation in a specific laser-based direct-write setup. The effects of the beam intensity distribution are measured at different positions and compared with simulations. The influence of the main parameters, pupil displacement and objective tilt, on the morphological properties of the intensity distribution is analysed. Well-defined spots with good reproducibility are obtained. In addition, by comparing the simulations with the experiments, the origin of some morphological properties is identified, which can be used to optimize the setup.
Identifying Sorting - In Theory
We argue that using wage data alone, it is virtually impossible to identify whether assortative matching between worker and firm types is positive or negative. In standard competitive matching models, wages are determined by the marginal contribution of a worker, and this marginal contribution might be higher or lower at low-productivity firms depending on the production function. For every production function that induces positive sorting, we can find a production function that induces negative sorting but generates identical wages. This arises even when we allow for non-competitive mismatch, for example due to search frictions. Even though we cannot identify the sign of the sorting, we can identify its strength, i.e., the magnitude of the cross-partial, and the associated welfare loss. While we show analytically that standard fixed-effects regressions are not suitable for recovering the strength of sorting, we propose an alternative procedure that measures the strength of sorting in the presence of search frictions, independent of the sign of the sorting. Keywords: sorting, assortative matching, identification, linked employer-employee data, interpretation of fixed effects
Occupational Choice and Development
The rise in world trade since 1970 has raised the international mobility of labor services. We study the effect of such a globalization of the world's labor markets. We find that when people can choose between wage work and managerial work, the output gains are U-shaped: a worldwide labor market raises output more in rich and poor countries and less in middle-income countries. This is because middle-income countries experience the smallest change in the factor-price ratio, and it is there that the option to choose between wage work and managerial work has the least value in the integrated economy. Our theory also establishes that, after economic integration, high-skill countries see a disproportionate increase in managerial occupations. Using aggregate data on GDP, openness, and occupations from 115 countries, we find evidence for these patterns of occupational choice.
Interval simulation: raising the level of abstraction in architectural simulation
Detailed architectural simulators suffer from long development cycles and extremely long evaluation times. This longstanding problem is further exacerbated in the multi-core processor era. Existing solutions address the simulation problem either by sampling the simulated instruction stream or by mapping the simulation models onto FPGAs; these approaches achieve substantial simulation speedups while simulating performance in a cycle-accurate manner. This paper proposes interval simulation, which takes a completely different approach: interval simulation raises the level of abstraction and replaces the core-level cycle-accurate simulation model with a mechanistic analytical model. The analytical model estimates core-level performance by analyzing intervals, i.e., the timing between two miss events (branch mispredictions and TLB/cache misses); the miss events are determined through simulation of the memory hierarchy, cache coherence protocol, interconnection network, and branch predictor. By raising the level of abstraction, interval simulation reduces both development time and evaluation time. Our experimental results using the SPEC CPU2000 and PARSEC benchmark suites and the M5 multi-core simulator show good accuracy up to eight cores (average error of 4.6% and maximum error of 11% for the multi-threaded full-system workloads), while achieving a one-order-of-magnitude simulation speedup compared to cycle-accurate simulation. Moreover, interval simulation is easy to implement: our implementation of the mechanistic analytical model comprises only one thousand lines of code. Its high accuracy, fast simulation speed, and ease of use make interval simulation a useful complement to the architect's toolbox for exploring system-level and high-level micro-architecture trade-offs.
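The interval idea in this abstract can be sketched in a few lines; the following is a hypothetical, minimal illustration of interval-style performance estimation, not the paper's actual model. The function name, the interval representation, and the penalty values are all assumptions chosen for clarity.

```python
# Minimal sketch of interval-style core performance estimation (illustrative
# only; not the paper's implementation). A core issues instructions at its
# dispatch width until a miss event (branch misprediction, cache/TLB miss)
# ends the interval and adds a stall penalty.

def estimate_cycles(intervals, dispatch_width):
    """intervals: list of (instructions_in_interval, miss_penalty_cycles)."""
    total = 0.0
    for n_insns, penalty in intervals:
        total += n_insns / dispatch_width  # steady-state issue time
        total += penalty                   # stall from the ending miss event
    return total

# Example: three intervals on a hypothetical 4-wide core
cycles = estimate_cycles([(400, 20), (800, 0), (200, 120)], dispatch_width=4)
print(cycles)  # → 490.0
```

The miss events themselves (which intervals end, and with what penalty) would come from simulating the memory hierarchy and branch predictor, which is where the approach keeps detailed simulation.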
