
    Perceptual Context in Cognitive Hierarchies

    Cognition depends not only on bottom-up sensory feature abstraction but also on contextual information passed top-down. Context is higher-level information that helps to predict belief states at lower levels. The main contribution of this paper is a formalisation of perceptual context and its integration into a new process model for cognitive hierarchies. Several simple instantiations of a cognitive hierarchy illustrate the role of context. Notably, we demonstrate the use of context in a novel approach to visually tracking the pose of rigid objects with just a 2D camera.

    Active interoceptive inference and the emotional brain

    We review a recent shift in conceptions of interoception and its relationship to hierarchical inference in the brain. The notion of interoceptive inference means that bodily states are regulated by autonomic reflexes that are enslaved by descending predictions from deep generative models of our internal and external milieu. This re-conceptualization illuminates several issues in cognitive and clinical neuroscience with implications for experiences of selfhood and emotion. We first contextualize interoception in terms of active (Bayesian) inference in the brain, highlighting its enactivist (embodied) aspects. We then consider the key role of uncertainty or precision and how this might translate into neuromodulation. We next examine the implications for understanding the functional anatomy of the emotional brain, surveying recent observations on agranular cortex. Finally, we turn to theoretical issues, namely, the role of interoception in shaping a sense of embodied self and feelings. We draw links between physiological homoeostasis and allostasis, early cybernetic ideas of predictive control, and hierarchical generative models in predictive processing. The explanatory scope of interoceptive inference ranges from explanations for autism and depression through to consciousness. We offer a brief survey of these exciting developments.

    An agent-based approach for the dynamic and decentralized service reconfiguration in collaborative production scenarios

    Future industrial systems endorse innovative paradigms that address continuous flexibility, reconfiguration, and evolution to face the volatility of dynamic markets demanding complex and customized products. Smart manufacturing relies on the capability to adapt and evolve in the face of change, particularly by identifying, on the fly, opportunities to reconfigure behaviour and functionality and to offer new and better-adapted services. This paper introduces an agent-based approach for service reconfiguration that identifies reconfiguration opportunities pro-actively and dynamically, and implements on the fly the strategies that lead to better production efficiency. A prototype developed for a flexible manufacturing system case study verified the feasibility of greedy local service reconfiguration in competitive and collaborative industrial automation scenarios.

    The 'law of requisite variety' may assist climate change negotiations: a review of the Kyoto and Durban meetings

    Ashby wrote about cybernetics, during which discourse he described a Law that attempts to resolve difficulties arising in complex situations – he suggested using variety to combat complexity. In this paper, we note that the delegates to the UN Framework Convention on Climate Change (UNFCCC) meeting in Kyoto, 1997, were offered a ‘simplifying solution’ to cope with the complexity of discussing multiple pollutants allegedly contributing to ‘climate change’. We assert that the adoption of CO2eq has resulted in imprecise thinking regarding the ‘carbon footprint’ – that is, ‘CO2’ – to the exclusion of other pollutants. We propose, as Ashby might have done, that the CO2eq and other factors within the ‘climate change’ negotiations be disaggregated, allowing careful and specific individual solutions to be agreed for each factor. We propose that a new permanent and transparent ‘action group’ be in charge of agenda setting and manage the messy annual meetings. This body would be responsible for achieving accords at these annual meetings, rather than forcing this task on national hosts. We acknowledge the task is daunting and we recommend moving on from Ashby's Law to Beer's Viable Systems approach.

    Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems

    A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one single functional role within a system and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
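    The two prerequisites above (versatile agents, overlapping capabilities) can be sketched in a few lines. This is a minimal illustration, not the paper's genome:proteome model: the agent names, the capacity limit, and the greedy reassignment rule are all assumptions made for the example. When one agent is knocked out, a degenerate neighbour with spare capacity absorbs its function, so the perturbation is buffered by the network rather than by a dedicated backup.

```python
# Minimal networked-buffering sketch (illustrative assumptions throughout):
# versatile agents hold several capabilities, and capabilities overlap
# (degeneracy), so a local knockout can be absorbed by re-tasking neighbours.

agents = {
    "A": {"f1", "f2"},   # each agent can perform more than one function...
    "B": {"f2", "f3"},
    "C": {"f3", "f1"},   # ...and capabilities partially overlap
}
CAPACITY = 2  # assumed spare capacity: an agent may run up to 2 functions
assignment = {"f1": "A", "f2": "B", "f3": "C"}  # initial one-to-one tasking

def knockout(lost, agents, assignment, capacity=CAPACITY):
    """Greedily redistribute the lost agent's functions to degenerate peers."""
    survivors = {a: caps for a, caps in agents.items() if a != lost}
    new_assignment = {f: a for f, a in assignment.items() if a != lost}
    load = {a: 0 for a in survivors}
    for a in new_assignment.values():
        load[a] += 1
    for f in (f for f, a in assignment.items() if a == lost):
        candidates = [a for a, caps in survivors.items()
                      if f in caps and load[a] < capacity]
        if not candidates:
            raise RuntimeError(f"function {f} cannot be buffered")
        chosen = min(candidates, key=lambda a: load[a])  # least-loaded peer
        new_assignment[f] = chosen
        load[chosen] += 1
    return new_assignment

after = knockout("A", agents, assignment)
# all three functions remain covered: f1 migrates to C, which is capable of it
```

With this toy configuration, knocking out agent A leaves f1 uncovered, and C (whose capabilities overlap A's) picks it up without exceeding its capacity.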

    "Meaning" as a sociological concept: A review of the modeling, mapping, and simulation of the communication of knowledge and meaning

    The development of discursive knowledge presumes the communication of meaning as analytically different from the communication of information. Knowledge can then be considered as a meaning which makes a difference. Whereas the communication of information is studied in the information sciences and scientometrics, the communication of meaning has been central to Luhmann's attempts to make the theory of autopoiesis relevant for sociology. Analytical techniques such as semantic maps and the simulation of anticipatory systems enable us to operationalize the distinctions which Luhmann proposed as relevant to the elaboration of Husserl's "horizons of meaning" in empirical research: interactions among communications, the organization of meaning in instantiations, and the self-organization of interhuman communication in terms of symbolically generalized media such as truth, love, and power. Horizons of meaning, however, remain uncertain orders of expectations, and one should caution against reification from the meta-biological perspective of systems theory.

    Evolutionary connectionism: algorithmic principles underlying the evolution of biological organisation in evo-devo, evo-eco and evolutionary transitions

    The mechanisms of variation, selection and inheritance, on which evolution by natural selection depends, are not fixed over evolutionary time. Current evolutionary biology is increasingly focussed on understanding how the evolution of developmental organisations modifies the distribution of phenotypic variation, the evolution of ecological relationships modifies the selective environment, and the evolution of reproductive relationships modifies the heritability of the evolutionary unit. The major transitions in evolution, in particular, involve radical changes in developmental, ecological and reproductive organisations that instantiate variation, selection and inheritance at a higher level of biological organisation. However, current evolutionary theory is poorly equipped to describe how these organisations change over evolutionary time and especially how that results in adaptive complexes at successive scales of organisation (the key problem is that evolution is self-referential, i.e. the products of evolution change the parameters of the evolutionary process). Here we first reinterpret the central open questions in these domains from a perspective that emphasises the common underlying themes. We then synthesise the findings from a developing body of work that is building a new theoretical approach to these questions by converting well-understood theory and results from models of cognitive learning. Specifically, connectionist models of memory and learning demonstrate how simple incremental mechanisms, adjusting the relationships between individually-simple components, can produce organisations that exhibit complex system-level behaviours and improve the adaptive capabilities of the system. 
We use the term “evolutionary connectionism” to recognise that, by functionally equivalent processes, natural selection acting on the relationships within and between evolutionary entities can result in organisations that produce complex system-level behaviours in evolutionary systems and modify the adaptive capabilities of natural selection over time. We review the evidence supporting the functional equivalences between the domains of learning and of evolution, and discuss the potential for this to resolve conceptual problems in our understanding of the evolution of developmental, ecological and reproductive organisations and, in particular, the major evolutionary transitions.
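    The connectionist side of the analogy can be made concrete with a toy Hebbian memory. This is a standard Hopfield-style sketch, not code from the paper: it shows how a simple incremental rule that adjusts only pairwise relationships between individually-simple components yields a system-level behaviour (pattern completion) that none of the components has on its own.

```python
# Toy Hebbian memory (Hopfield-style): pairwise weights w_ij = p_i * p_j / N
# store a single pattern; the network then completes a corrupted probe.

N = 10
pattern = [1, -1, 1, 1, -1, -1, 1, -1, 1, -1]

# Hebbian weights from the stored pattern; no self-connections (w_ii = 0)
W = [[(pattern[i] * pattern[j]) / N if i != j else 0.0 for j in range(N)]
     for i in range(N)]

# corrupt two components of the pattern
probe = pattern[:]
probe[0] = -probe[0]
probe[3] = -probe[3]

def recall(state):
    """One synchronous update: each unit takes the sign of its weighted input."""
    return [1 if sum(W[i][j] * state[j] for j in range(N)) >= 0 else -1
            for i in range(N)]

restored = recall(probe)  # a single update restores the stored pattern
```

The point of the analogy in the abstract is that selection acting on *relationships* (here, the weights) rather than on components can, by an equally simple incremental process, build organisations with such system-level adaptive behaviour.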

    A Minimal Model of Metabolism Based Chemotaxis

    Since the pioneering work of Julius Adler in the 1960s, bacterial chemotaxis has been studied predominantly as metabolism-independent, and all available simulation models of bacterial chemotaxis endorse this assumption. Recent studies have shown, however, that many metabolism-dependent chemotactic patterns occur in bacteria. We present the simplest artificial protocell model capable of performing metabolism-based chemotaxis. The model serves as a proof of concept showing how even the simplest metabolism can sustain chemotactic patterns of varying sophistication. It also reproduces a set of recently highlighted phenomena in bacterial chemotaxis and provides insights into alternative mechanisms that could instantiate them. We conclude that relaxing the metabolism-independence assumption provides important theoretical advances, forces us to rethink some established preconceptions, and may help us better understand unexplored and poorly understood aspects of bacterial chemotaxis.
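    The general idea of metabolism-based chemotaxis can be illustrated with a deliberately crude sketch. This is not the paper's protocell model; all details (1-D space, linear nutrient field, the two tumble probabilities) are assumptions chosen for the example. The walker has no dedicated chemosensor: it only monitors whether its metabolic uptake is improving, and tumbles (reverses direction) less often while it is.

```python
import random

def simulate(n_agents=200, n_steps=400, seed=1):
    """1-D run-and-tumble walkers whose tumble rate depends only on whether
    metabolic uptake improved since the last step (no direct gradient sensing).
    Nutrient concentration, and hence uptake, grows linearly toward +x."""
    random.seed(seed)
    finals = []
    for _ in range(n_agents):
        x, direction, prev_uptake = 0.0, random.choice([-1, 1]), 0.0
        for _ in range(n_steps):
            uptake = x  # assumed linear nutrient field: uptake tracks position
            # improving metabolism suppresses tumbling; worsening promotes it
            p_tumble = 0.1 if uptake > prev_uptake else 0.6
            if random.random() < p_tumble:
                direction = -direction
            x += direction
            prev_uptake = uptake
        finals.append(x)
    return sum(finals) / len(finals)

mean_final = simulate()  # population drifts toward the nutrient source (+x)
```

Runs up the gradient last longer than runs down it, so a purely metabolic signal is enough to produce net chemotactic drift, which is the qualitative point of the abstract.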

    A Measurement of Rb using a Double Tagging Method

    The fraction of Z to bbbar events in hadronic Z decays has been measured by the OPAL experiment using the data collected at LEP between 1992 and 1995. The Z to bbbar decays were tagged using displaced secondary vertices and high-momentum electrons and muons. Systematic uncertainties were reduced by measuring the b-tagging efficiency with a double-tagging technique. Efficiency correlations between opposite hemispheres of an event are small, and are well understood through comparisons between real and simulated data samples. A value of Rb = 0.2178 ± 0.0011 ± 0.0013 was obtained, where the first error is statistical and the second systematic. The uncertainty on Rc, the fraction of Z to ccbar events in hadronic Z decays, is not included in the errors. The dependence on Rc is Delta(Rb)/Rb = -0.056 * Delta(Rc)/Rc, where Delta(Rc) is the deviation of Rc from the value 0.172 predicted by the Standard Model. The result for Rb agrees with the value of 0.2155 ± 0.0003 predicted by the Standard Model.
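    The double-tagging idea can be sketched as follows. This is a simplified illustration only: hemisphere efficiency correlations, udsc backgrounds, and the actual fit are all neglected, and the input numbers are chosen for the example rather than taken from the OPAL data. With a per-hemisphere b-tag efficiency eps_b, the single-tag and double-tag fractions determine both eps_b and Rb from data alone, which is why the method reduces the systematic uncertainty from the tag efficiency.

```python
# Idealised double-tag counting (no backgrounds, no hemisphere correlations):
#   f_single = eps_b * Rb        fraction of hemispheres that are tagged
#   f_double = eps_b**2 * Rb     fraction of events with both hemispheres tagged
# Inverting:
#   eps_b = f_double / f_single
#   Rb    = f_single**2 / f_double

eps_b_true, rb_true = 0.25, 0.2178   # illustrative inputs, not OPAL values
f_single = eps_b_true * rb_true       # what would be observed in data
f_double = eps_b_true**2 * rb_true

eps_b_hat = f_double / f_single       # efficiency measured from data itself
rb_hat = f_single**2 / f_double       # Rb without relying on simulated eps_b
```

In the real analysis the small hemisphere correlations quoted in the abstract enter as a correction to the f_double relation, which is why their size and modelling dominate the remaining systematics.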

    Measurement of the B+ and B-0 lifetimes and search for CP(T) violation using reconstructed secondary vertices

    The lifetimes of the B+ and B0 mesons, and their ratio, have been measured in the OPAL experiment using 2.4 million hadronic Z0 decays recorded at LEP. Z0 to bbbar decays were tagged using displaced secondary vertices and high-momentum electrons and muons. The lifetimes were then measured using well-reconstructed charged and neutral secondary vertices selected in this tagged data sample. The results are
        tau(B+) = 1.643 ± 0.037 ± 0.025 ps,
        tau(B0) = 1.523 ± 0.057 ± 0.053 ps,
        tau(B+)/tau(B0) = 1.079 ± 0.064 ± 0.041,
    where in each case the first error is statistical and the second systematic. A larger data sample of 3.1 million hadronic Z0 decays has been used to search for CP- and CPT-violating effects by comparing inclusive b and bbar hadron decays. No evidence for such effects is seen. The CP violation parameter Re(epsilon_B) is measured to be
        Re(epsilon_B) = 0.001 ± 0.014 ± 0.003,
    and the fractional difference between b and bbar hadron lifetimes is measured to be
        (Delta tau/tau)_b = [tau(b hadron) - tau(bbar hadron)] / tau(average) = -0.001 ± 0.012 ± 0.008.
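    As a quick arithmetic cross-check of the quoted ratio (a sketch only: the published 0.064 statistical error comes from the full fit, with correlations between the two measurements, not from naive uncorrelated propagation as done here):

```python
from math import sqrt

# Central values and statistical errors quoted above
tau_plus, stat_plus = 1.643, 0.037   # tau(B+) in ps
tau_zero, stat_zero = 1.523, 0.057   # tau(B0) in ps

ratio = tau_plus / tau_zero          # reproduces the quoted 1.079

# Naive uncorrelated propagation of the statistical errors only;
# the paper's 0.064 is larger because it comes from the full fit
naive_stat = ratio * sqrt((stat_plus / tau_plus) ** 2
                          + (stat_zero / tau_zero) ** 2)
```

The central value of the ratio follows directly from the two lifetimes; the naive error (about 0.047) is smaller than the published statistical error, consistent with the full fit treating the two measurements and their correlations properly.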