
    Telework Configurations and Labour Productivity: some stylized facts

    The development of information and communication technologies has led to the rise of new forms of work in firms, some of which are temporally and spatially dispersed, such as telework. However, ‘telework’ is a broad concept, covering different forms of remote work as well as diverse reasons for, and performance implications of, separating work from the firm’s premises. Following this consideration, this paper explores two sides of telework: 1) the main types of telework practices adopted by firms in relation to their technological, organizational and environmental context; and 2) the association between the adoption of telework practices and labour productivity. Specifically, analysing data gathered through a survey of Italian enterprises conducted between 2005 and 2009, we identify two main typologies of telework: 1) firms using home-based forms of telework; and 2) firms using mobile forms of telework. Whereas firms predominantly using the first modality exhibit neither a superior endowment of information systems nor higher labour productivity, firms deploying “mobile work” practices are characterized by a higher adoption of information systems, deal with more dynamic business environments and exhibit higher labour productivity relative to firms that do not use telework practices.

    The Approach to Ergodicity in Monte Carlo Simulations

    The approach to the ergodic limit in Monte Carlo simulations is studied using both analytic and numerical methods. With the help of a stochastic model, a metric is defined that enables the examination of a simulation in both the ergodic and non-ergodic regimes. In the non-ergodic regime, the model predicts how the simulation is expected to approach ergodic behavior, and the analytically inferred decay law of the metric allows the onset of ergodic behavior to be monitored. The metric is related to measures previously developed for molecular dynamics simulations, and it enables a comparison of the relative efficiencies of different Monte Carlo schemes. Applications to Lennard-Jones 13-particle clusters are shown to match the model for Metropolis, J-walking and parallel tempering based approaches. The relative efficiencies of these three Monte Carlo approaches are compared, and the decay law is shown to be useful in determining the high-temperature parameters needed in parallel tempering and J-walking studies of atomic clusters.
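
    The metric related to earlier molecular-dynamics measures can be illustrated with a small numerical sketch: two independent Metropolis runs are started from different configurations, and the squared difference of their running energy averages is monitored as it decays towards zero in the ergodic regime. The 1D double-well potential, temperature and step size below are illustrative assumptions standing in for the Lennard-Jones clusters studied in the paper.

```python
import numpy as np

def potential(x):
    # Toy 1D double-well potential (illustrative stand-in for a cluster energy surface)
    return (x**2 - 1.0) ** 2

def metropolis_run(x0, beta, n_sweeps, step=0.5, rng=None):
    """Return the running time-average of the energy after each sweep."""
    rng = np.random.default_rng() if rng is None else rng
    x, e = x0, potential(x0)
    running_avg = np.empty(n_sweeps)
    e_sum = 0.0
    for t in range(n_sweeps):
        x_new = x + step * rng.uniform(-1.0, 1.0)
        e_new = potential(x_new)
        if e_new <= e or rng.random() < np.exp(-beta * (e_new - e)):
            x, e = x_new, e_new
        e_sum += e
        running_avg[t] = e_sum / (t + 1)
    return running_avg

# Two independent runs started in different wells
beta, n_sweeps = 4.0, 20000
avg_a = metropolis_run(-1.0, beta, n_sweeps, rng=np.random.default_rng(0))
avg_b = metropolis_run(+1.0, beta, n_sweeps, rng=np.random.default_rng(1))

# Ergodic measure: squared difference of the running averages of the two runs.
# For an ergodic simulation this is expected to decay roughly as 1/t.
d = (avg_a - avg_b) ** 2
print(d[100], d[1000], d[-1])
```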

    Inference by replication in densely connected systems

    An efficient Bayesian inference method for problems that can be mapped onto dense graphs is presented. The approach is based on message passing, where messages are averaged over a large number of replicated variable systems exposed to the same evidential nodes. An assumption about the symmetry of the solutions is required for carrying out the averages; here we extend the previous derivation, based on a replica symmetric (RS)-like structure, to include a more complex one-step replica symmetry breaking (1RSB)-like ansatz. To demonstrate the potential of the approach, it is employed to study critical properties of the Ising linear perceptron and multiuser detection in Code Division Multiple Access (CDMA) under different noise models. Results obtained under the RS assumption in the non-critical regime give rise to a highly efficient signal detection algorithm in the context of CDMA, while in the critical regime one observes a first-order transition line that ends in a continuous phase transition point. Finite-size effects are also observed. While the 1RSB ansatz is not required for the original problems, it was applied to the CDMA signal detection problem with a more complex noise model that exhibits RSB behaviour, resulting in an improvement in performance.
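
    The replicated message-passing derivation itself is not reproduced here, but the kind of CDMA signal detection it targets can be conveyed by a much simpler hedged sketch: a naive mean-field (soft interference cancellation) iteration on a synthetic CDMA instance. The number of users, chip length and noise level are illustrative assumptions, and this baseline is not the algorithm derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, sigma = 20, 40, 0.3                 # users, chip length, noise level (illustrative)

b = rng.choice([-1.0, 1.0], size=K)       # transmitted bits
S = rng.choice([-1.0, 1.0], size=(N, K))  # binary spreading codes
y = S @ b / np.sqrt(N) + sigma * rng.standard_normal(N)

h = S.T @ y / np.sqrt(N)                  # matched-filter outputs
R = S.T @ S / N                           # normalized user cross-correlations
np.fill_diagonal(R, 0.0)                  # exclude the self-interference term

m = np.zeros(K)                           # soft bit estimates
for _ in range(100):
    # Subtract the estimated interference from the other users, then squash
    # with tanh at the (assumed known) noise level; damping helps convergence.
    m = 0.5 * m + 0.5 * np.tanh((h - R @ m) / sigma**2)

b_hat = np.sign(m)
print("bit errors:", int(np.sum(b_hat != b)))
```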

    Dynamical replica theoretic analysis of CDMA detection dynamics

    We investigate the detection dynamics of the Gibbs sampler for code-division multiple access (CDMA) multiuser detection. Our approach is based on dynamical replica theory, which allows an analytic approximation to the dynamics. We use this tool to investigate the basins of attraction when phase coexistence occurs and examine its efficacy via comparison with Monte Carlo simulations.
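
    A minimal Gibbs sampler for CDMA multiuser detection, of the kind whose dynamics the paper analyses, might look like the sketch below: each bit is resampled from its conditional posterior given the other bits, and the bits are then estimated from the sampled marginals. The instance sizes and noise level are illustrative assumptions; no dynamical replica analysis is attempted here.

```python
import numpy as np

rng = np.random.default_rng(1)
K, N, sigma = 20, 40, 0.4            # users, chip length, noise level (illustrative)

b_true = rng.choice([-1.0, 1.0], size=K)
S = rng.choice([-1.0, 1.0], size=(N, K))
y = S @ b_true / np.sqrt(N) + sigma * rng.standard_normal(N)

h = S.T @ y / np.sqrt(N)             # matched-filter outputs
R = S.T @ S / N                      # normalized user cross-correlations
np.fill_diagonal(R, 0.0)

b = rng.choice([-1.0, 1.0], size=K)  # random initial configuration
marginals = np.zeros(K)
n_sweeps, burn_in = 2000, 500
for sweep in range(n_sweeps):
    for k in range(K):
        # Local field on bit k given the current values of the other bits
        phi = h[k] - R[k] @ b
        p_plus = 0.5 * (1.0 + np.tanh(phi / sigma**2))
        b[k] = 1.0 if rng.random() < p_plus else -1.0
    if sweep >= burn_in:
        marginals += b

b_hat = np.sign(marginals)           # sign of the posterior-mean estimate
print("bit errors:", int(np.sum(b_hat != b_true)))
```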

    Dynamical transitions in the evolution of learning algorithms by selection

    We study the evolution of artificial learning systems by means of selection. Genetic programming is used to generate a sequence of populations of algorithms which can be used by neural networks for supervised learning of a rule that generates examples. Rather than concentrating on final results, which would be the natural aim when designing good learning algorithms, we study the evolution process and pay particular attention to the temporal order in which the functional structures responsible for improvements in the learning process appear, as measured by the generalization capabilities of the resulting algorithms. The effect of such appearances can be described as dynamical phase transitions. The concepts of phenotypic and genotypic entropies, which describe the distribution of fitness in the population and the distribution of symbols, respectively, are used to monitor the dynamics. In different runs the phase transitions may or may not be present, with the system either finding good solutions or remaining in poor regions of algorithm space. Whenever phase transitions occur, the sequence of appearances is the same. We identify combinations of variables and operators which are useful in measuring experience or performance in rule extraction and can thus implement useful annealing of the learning schedule.
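
    A toy illustration of the two monitoring quantities mentioned in the abstract: phenotypic entropy computed from the distribution of fitness values across the population, and genotypic entropy computed from the distribution of symbols in the evolved programs. The population, symbol set and fitness values below are invented purely for illustration.

```python
import numpy as np
from collections import Counter

def entropy_bits(counts):
    """Shannon entropy (in bits) of the distribution defined by a Counter."""
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum())

# Toy population: each individual is a program (a sequence of symbols)
# together with a fitness, e.g. a generalization score on a rule-learning task.
population = [
    {"program": ["x", "+", "w", "*", "sign"], "fitness": 0.71},
    {"program": ["x", "*", "w", "+", "x"],    "fitness": 0.55},
    {"program": ["w", "*", "sign", "+", "x"], "fitness": 0.71},
    {"program": ["x", "+", "x"],              "fitness": 0.40},
]

# Phenotypic entropy: spread of fitness values in the population
# (fitness is binned so that near-identical scores count as one phenotype).
fitness_bins = Counter(round(ind["fitness"], 1) for ind in population)
# Genotypic entropy: spread of the symbols (variables and operators) used.
symbol_counts = Counter(sym for ind in population for sym in ind["program"])

print("phenotypic entropy:", entropy_bits(fitness_bins))
print("genotypic entropy:", entropy_bits(symbol_counts))
```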

    Formation and destruction of polycyclic aromatic hydrocarbon clusters in the interstellar medium

    The competition between the formation and destruction of coronene clusters under interstellar conditions is investigated theoretically. The unimolecular nucleation of neutral clusters is simulated with an atomic model combining an explicit classical force field and a quantum tight-binding approach. Evaporation rates are calculated in the framework of phase space theory, inserted into an infrared emission model, and compared with the growth rate constants. It is found that, under interstellar conditions, most collisions lead to cluster growth. The time evolution of small clusters (containing up to 312 carbon atoms) was specifically investigated under the physical conditions of the northern photodissociation region (PDR) of NGC 7023. These clusters are found to be thermally photoevaporated much faster than they are reformed, thus providing an interpretation for the lower limit of the interstellar cluster size distribution inferred from observations. The effects of ionizing the clusters and of density heterogeneities are also considered. Based on our results, the possibility that PAH clusters could be formed in PDRs is critically discussed. (Accepted for publication in Astronomy & Astrophysics.)

    Perceptron capacity revisited: classification ability for correlated patterns

    In this paper, we address the problem of how many randomly labeled patterns can be correctly classified by a single-layer perceptron when the patterns are correlated with each other. To solve this problem, two analytical schemes are developed based on the replica method and the Thouless-Anderson-Palmer (TAP) approach, utilizing an integral formula concerning random rectangular matrices. The validity and relevance of the developed methodologies are demonstrated on one known result and two example problems. A message-passing algorithm for carrying out the TAP scheme is also presented.
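
    Independently of the replica and TAP analysis, the underlying question can be probed numerically: generate correlated patterns with random labels and test directly whether a zero-threshold perceptron can realize them, here via a linear-programming feasibility check. The correlation structure (a shared common component of strength rho) and all sizes are illustrative assumptions, not the model treated in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def is_separable(X, labels):
    """Feasibility check: does some w satisfy labels_mu * (w . x_mu) >= 1 for all mu?"""
    A_ub = -(labels[:, None] * X)        # encode -labels_mu * (x_mu . w) <= -1
    b_ub = -np.ones(len(labels))
    res = linprog(c=np.zeros(X.shape[1]), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * X.shape[1], method="highs")
    return res.status == 0               # 0 means a feasible w was found

rng = np.random.default_rng(0)
N, rho = 50, 0.3                         # input dimension, pattern correlation (illustrative)

def correlated_patterns(P):
    # Patterns share a common random component, giving pairwise correlation ~ rho
    common = rng.standard_normal(N)
    return np.sqrt(rho) * common + np.sqrt(1.0 - rho) * rng.standard_normal((P, N))

# Scan the load alpha = P/N and estimate the fraction of separable instances
for alpha in (1.0, 1.5, 2.0, 2.5, 3.0):
    P = int(alpha * N)
    frac = np.mean([is_separable(correlated_patterns(P),
                                 rng.choice([-1.0, 1.0], size=P))
                    for _ in range(20)])
    print(f"alpha = {alpha}: fraction separable = {frac:.2f}")
```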

    Roadmaps to Utopia: Tales of the Smart City

    Notions of the Smart City are pervasive in urban development discourses. Various frameworks for the development of smart cities, often conceptualized as roadmaps, make a number of implicit claims about how smart city projects proceed, but the legitimacy of those claims is unclear. This paper begins to address this gap in knowledge. We explore the development of a smart transport application, MotionMap, in the context of a £16M smart city programme taking place in Milton Keynes, UK. We examine how the idealized smart city narrative was locally inflected, and discuss the differences between the narrative and the processes and outcomes observed in Milton Keynes. The research shows that the vision of data-driven efficiency outlined in the roadmaps is not universally compelling, and that different approaches to the sensing and optimization of urban flows have the potential to empower or disempower different actors. Roadmaps tend to emphasize the importance of delivering quick practical results. However, the benefits observed in Milton Keynes did not come from quick technical fixes but from a smart city narrative that reinforced existing city branding, mobilizing a growing network of actors towards the development of a smart region. Further research is needed to investigate this and other smart city developments, the significance of different smart city narratives, and how power relationships are reinforced and constructed through them.

    A Q-Ising model application for linear-time image segmentation

    A computational method is presented which efficiently segments digital grayscale images by directly applying the Q-state Ising (or Potts) model. Since the Potts model was first proposed in 1952, physicists have studied lattice models to gain deep insights into magnetism and other disordered systems. For some time, researchers have realized that digital images may be modeled in much the same way as these physical systems, i.e., as a square lattice of numerical values. A major drawback of conventional Potts-model methods for image segmentation is that they run in exponential time. Advances have been made via certain approximations that reduce the segmentation process to power-law time. However, many applications (such as sonar imagery) require real-time processing and hence much greater efficiency. This article describes an energy minimization technique that applies four Potts (Q-Ising) models directly to the image and runs in linear time. The result is analogous to partitioning the system into regions of four classes of magnetism. This direct Potts segmentation technique is demonstrated on photographic, medical, and acoustic images. (Central European Journal of Physics, in press, 2010.)
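
    The article's specific linear-time construction is not reproduced here; the sketch below only illustrates the general idea of direct Potts-model segmentation with a per-sweep cost linear in the number of pixels, using greedy local updates (iterated conditional modes) on a synthetic grayscale image. The class count Q, coupling J and the test image are illustrative assumptions.

```python
import numpy as np

def potts_segment(image, Q=4, J=1.0, n_sweeps=10):
    """Segment a grayscale image into Q classes by greedily minimizing a
    Potts-type energy: a data term pulling each pixel towards its class mean
    plus a smoothness term J counting unlike nearest-neighbour labels.
    Each sweep visits every pixel once, so the cost per sweep is linear."""
    img = image.astype(float)
    mu = np.linspace(img.min(), img.max(), Q)       # initial class means
    labels = np.abs(img[..., None] - mu).argmin(axis=-1)

    H, W = img.shape
    for _ in range(n_sweeps):
        for i in range(H):
            for j in range(W):
                neigh = []                          # 4-connected neighbour labels
                if i > 0: neigh.append(labels[i - 1, j])
                if i < H - 1: neigh.append(labels[i + 1, j])
                if j > 0: neigh.append(labels[i, j - 1])
                if j < W - 1: neigh.append(labels[i, j + 1])
                neigh = np.array(neigh)
                # Local energy of each candidate class for this pixel
                data = (img[i, j] - mu) ** 2
                smooth = J * np.array([(neigh != q).sum() for q in range(Q)])
                labels[i, j] = int((data + smooth).argmin())
        for q in range(Q):                          # re-estimate class means
            if (labels == q).any():
                mu[q] = img[labels == q].mean()
    return labels

# Usage on a synthetic image: two bright blobs on a noisy dark background
rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.05, size=(64, 64))
img[10:30, 10:30] += 0.5
img[40:60, 35:55] += 0.8
seg = potts_segment(img, Q=4)
print("classes used:", np.unique(seg))
```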

    Parallel strategy for optimal learning in perceptrons

    We develop a parallel strategy for the optimal learning of specific realizable rules by perceptrons in an online learning scenario. Our result is a generalization of the Caticha–Kinouchi (CK) algorithm, which was developed for learning a perceptron with a synaptic vector drawn from a uniform distribution over the N-dimensional sphere, the so-called typical case. Our method outperforms the CK algorithm in almost all possible situations, failing only in a denumerable set of cases. The algorithm is optimal in the sense that it saturates Bayesian bounds when it succeeds.
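
    For context, a plain online perceptron baseline (not the optimal Caticha–Kinouchi modulation function and not the parallel strategy developed here) can be sketched as follows: a student perceptron learns a realizable rule from a stream of examples, and the generalization error is tracked through the teacher-student overlap. The dimension and number of examples are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_examples = 200, 20 * 200          # input dimension and number of online examples

B = rng.standard_normal(N)             # teacher ("rule") weight vector
B /= np.linalg.norm(B)
J = rng.standard_normal(N) * 1e-3      # student starts almost uninformed

errors = []
for _ in range(n_examples):
    x = rng.standard_normal(N)
    sigma_teacher = np.sign(B @ x)     # label supplied by the realizable rule
    sigma_student = np.sign(J @ x)
    # Plain perceptron rule: update only on mistakes (a simple baseline)
    if sigma_student != sigma_teacher:
        J += sigma_teacher * x / np.sqrt(N)
    # Generalization error from the teacher-student overlap rho
    rho = (J @ B) / np.linalg.norm(J)
    errors.append(np.arccos(np.clip(rho, -1.0, 1.0)) / np.pi)

print("generalization error after training:", errors[-1])
```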