Identities in the Algebra of Partial Maps
We consider the identities of a variety of semigroup-related algebras modelling the algebra of partial maps. We show that the identities are intimately related to a weak semigroup deductive system and we show that the equational theory is decidable. We do this by giving a term rewriting system for the variety. We then show that this variety has many subvarieties whose equational theories interpret the full uniform word problem for semigroups and are consequently undecidable. As a corollary it is shown that the equational theory of Clifford semigroups whose natural order is a semilattice is undecidable.
Monoids with tests and the algebra of possibly non-halting programs
We study the algebraic theory of computable functions, which can be viewed as arising from possibly non-halting computer programs or algorithms, acting on some state space, equipped with operations of composition, if-then-else and while-do defined in terms of a Boolean algebra of conditions. It has previously been shown that there is no finite axiomatisation of algebras of partial functions under these operations alone, and this holds even if one restricts attention to transformations (representing halting programs) rather than partial functions, and omits while-do from the signature. In the halting case, there is a natural “fix”, which is to allow composition of halting programs with conditions, and then the resulting algebras admit a finite axiomatisation. In the current setting such compositions are not possible, but by extending the notion of if-then-else, we are able to give finite axiomatisations of the resulting algebras of (partial) functions, with while-do in the signature if the state space is assumed finite. The axiomatisations are extended to consider the partial predicate of equality. All algebras considered turn out to be enrichments of the notion of a (one-sided) restriction semigroup.
Value innovation modelling: Design thinking as a tool for business analysis and strategy
This paper explores the use of multiple perspective problem framing (English 2008) as a tool to reveal hidden value and commercial opportunity for business.
Creative thinking involves the interrelationship of parameters held open and fluid within the cognitive span of the creative mind. The recognition of new associations can create new value that can lead to innovation in designed products, intellectual property and business strategy.
The ‘Ideas-lab’ process is based on the proposition that a company’s capacity for innovation is dependent on the way the business is able to see its problems and opportunities. In this process the attributes of a company and the experience of the researchers are considered as the parameters of a design problem. It is therefore important to acknowledge the commercial experience of the project researchers, all of whom have a proven track record in helping businesses develop, exploit and protect their know-how.
Semi-structured interviews were carried out with key individuals in 34 companies. The resulting data were assessed on a company-by-company basis through a process of multiple perspective problem framing, enabling key nodes, patterns and relationships to be identified and explored. A ‘Cornerstones of Innovation’ report was prepared to inform each company of the observations made by the researchers.
The paper describes the methods adopted and summarises the feedback from participating companies. Case studies are highlighted to demonstrate ways in which the process influenced the actions of particular businesses, and the commercial outcomes that resulted. Finally the researchers reflect on the structure of the Ideas-lab process.
Semigroups with if-then-else and halting programs
The "if–then–else" construction is one of the most elementary programming commands, and its abstract laws have been widely studied, starting with McCarthy. Possibly the most obvious extension of this is to include the operation of composition of programs, which gives a semigroup of functions (total, partial, or possibly general binary relations) that can be recombined using if–then–else. We show that this particular extension admits no finite complete axiomatization and instead focus on the case where composition of functions with predicates is also allowed (and we argue there is good reason to take this approach). In the case of total functions — modeling halting programs — we give a complete axiomatization for the theory in terms of a finite system of equations. We obtain a similar result when an operation of equality test and/or fixed point test is included.
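As a toy illustration of the operations studied in the abstract above — composition of functions and if-then-else over predicates — here is a minimal sketch in Python. All names are illustrative, not taken from the paper; the final assertion checks one McCarthy-style law, the distribution of if-then-else over post-composition.

```python
# Total functions on a small state space, combined by composition
# and if-then-else over a Boolean predicate (illustrative names).

def compose(f, g):
    """Composition f;g: apply f first, then g (left-to-right)."""
    return lambda x: g(f(x))

def if_then_else(p, f, g):
    """The function that runs f where predicate p holds, else g."""
    return lambda x: f(x) if p(x) else g(x)

# Example on integer states: halve even numbers, otherwise add one.
halve = lambda x: x // 2
succ = lambda x: x + 1
even = lambda x: x % 2 == 0

step = if_then_else(even, halve, succ)
assert step(8) == 4 and step(7) == 8

# A McCarthy-style law: (if p then f else g);h == if p then f;h else g;h
h = lambda x: x * 3
lhs = compose(if_then_else(even, halve, succ), h)
rhs = if_then_else(even, compose(halve, h), compose(succ, h))
assert all(lhs(x) == rhs(x) for x in range(10))
```

The law holds because the predicate is evaluated on the same input state on both sides; laws involving *pre*-composition are subtler, which is one motivation for allowing composition of functions with predicates.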
Partial maps with domain and range: extending Schein's representation
The semigroup of all partial maps on a set under the operation of composition admits a number of operations relating to the domain and range of a partial map. Of particular interest are the operations R and L returning the identity on the domain of a map and on the range of a map respectively. Schein [25] gave an axiomatic characterisation of the semigroups with R and L representable as systems of partial maps; the class is a finitely axiomatisable quasivariety closely related to ample semigroups (which were introduced — as type A semigroups — by Fountain [7]). We provide an account of Schein's result (which until now appears only in Russian) and extend Schein's method to include the binary operations of intersection, of greatest common range restriction, and some unary operations relating to the set of fixed points of a partial map. Unlike the case of semigroups with R and L, a number of the possibilities can be equationally axiomatised.
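A concrete sketch of the objects in the abstract above: partial maps on a finite set modelled as Python dicts, with composition and the operations R and L (names follow the abstract; the encoding is mine). The assertions check two characteristic identities of such systems.

```python
# Partial maps on a finite set, encoded as dicts (key = point where
# the map is defined, value = image). Encoding is illustrative.

def compose(f, g):
    """f followed by g, defined only where both steps are defined."""
    return {x: g[f[x]] for x in f if f[x] in g}

def R(f):
    """Identity map restricted to the domain of f."""
    return {x: x for x in f}

def L(f):
    """Identity map restricted to the range of f."""
    return {y: y for y in f.values()}

f = {1: 2, 2: 3}  # partial map on {1, 2, 3, 4}, undefined at 3 and 4
g = {2: 4, 3: 4}

assert compose(f, g) == {1: 4, 2: 4}
# Characteristic laws: R(f);f == f  and  f;L(f) == f
assert compose(R(f), f) == f
assert compose(f, L(f)) == f
```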
Light Spectrum and its Implications on Milk Production
This information was part of the August 2013 issue of Eastern DairyBusiness Magazine. The Manager, a section within the Eastern DairyBusiness Magazine, is authored and organized by the PRO-DAIRY program in the College of Agriculture and Life Sciences at Cornell University.
Towards a Stock-Flow Consistent Ecological Macroeconomics
Modern western economies (in the Eurozone and elsewhere) face a number of challenges over the coming decades. Achieving full employment, meeting climate change and other key environmental targets, and reducing inequality rank amongst the highest of these. The conventional route to achieving these goals has been to pursue economic growth. But this route has created two critical problems for modern economies. The first is that higher growth leads (ceteris paribus) to higher environmental impact. The second is that fragility in financial balances has accompanied relentless demand expansion. The prevailing global response to the first problem has been to encourage a decoupling of output from impacts by investing in green technologies (green growth). But this response runs the risk of exacerbating problems associated with the over-leveraging of households, firms and governments and places undue confidence in unproven and imagined technologies. An alternative approach is to reduce the pace of growth and to restructure economies around green services (post-growth). But the potential dangers of declining growth rates lie in increased inequality and in rising unemployment. Some more fundamental arguments have also been made against the feasibility of interest-bearing debt within a post-growth economy. The work described in this paper was motivated by the need to address these fundamental dilemmas and to inform the debate that has emerged in recent years about the relative merits of green growth and post-growth scenarios. In pursuit of this aim we have developed a suite of macroeconomic models based on the methodology of Post-Keynesian Stock Flow Consistent (SFC) system dynamics. Taken together these models represent the first steps in constructing a new macroeconomic synthesis capable of exploring the economic and financial dimensions of an economy confronting resource or environmental constraints.
Such an ecological macroeconomics includes an account of basic macroeconomic variables such as GDP, consumption, investment, saving, public spending, employment, and productivity. It also accounts for the performance of the economy in terms of financial balances, net lending positions, money supply, distributional equity and financial stability. This report illustrates the utility of this new approach through a number of specific analyses and scenario explorations. These include an assessment of the Piketty hypothesis (that slow growth increases inequality), an analysis of the "growth imperative" hypothesis (that interest-bearing debt requires economic growth for stability), and an analysis of the financial and monetary implications of green investment policies. The work also assesses the scope for fiscal policy to improve social and environmental outcomes.
Estimating direct and indirect rebound effects for UK households
Energy efficiency improvements by households lead to rebound effects that offset the potential energy and emissions savings. Direct rebound effects result from increased demand for cheaper energy services, while indirect rebound effects result from increased demand for other goods and services that also require energy to provide. Research to date has focused upon the former, but both are important for climate change. This study estimates the combined direct and indirect rebound effects from seven measures that improve the energy efficiency of UK dwellings. The methodology is based upon estimates of the income elasticity and greenhouse gas (GHG) intensity of 16 categories of household goods and services, and allows for the embodied emissions of the energy efficiency measures themselves. Rebound effects are measured in GHG terms and relate to the adoption of these measures by an average UK household. The study finds that the rebound effects from these measures are typically in the range 5-15% and arise mostly from indirect effects. This is largely because expenditure on gas and electricity is more GHG-intensive than expenditure on other goods and services. However, the anticipated shift towards a low carbon electricity system in the UK may lead to much larger rebound effects.
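The basic arithmetic of a combined rebound effect, as described in the abstract above, can be sketched as follows. All figures here are invented for illustration and are not the study's data: re-spending of freed income across categories of differing GHG intensity, plus the measure's embodied emissions, are expressed as a fraction of the potential GHG saving.

```python
# Illustrative rebound-effect arithmetic (all numbers invented,
# not taken from the study).

potential_saving_kgco2 = 1000.0  # GHG saved if freed income were not re-spent
freed_income_gbp = 200.0         # money no longer spent on gas/electricity

# Hypothetical re-spending shares and GHG intensities (kgCO2e per GBP)
respend = {
    "food":      (0.4, 0.9),
    "transport": (0.3, 1.1),
    "services":  (0.3, 0.3),
}

# Indirect effect: emissions embodied in the re-spent income
indirect = sum(share * freed_income_gbp * intensity
               for share, intensity in respend.values())

embodied = 30.0  # embodied emissions of the efficiency measure itself

rebound = (indirect + embodied) / potential_saving_kgco2
print(f"combined rebound effect: {rebound:.1%}")
```

With these made-up figures the combined rebound is 18.6%; the study's central point is that the indirect term dominates when re-spending falls on categories less GHG-intensive than gas and electricity.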
The WIMP Forest: Indirect Detection of a Chiral Square
The spectrum of photons arising from WIMP annihilation carries a detailed imprint of the structure of the dark sector. In particular, loop-level annihilations into a photon and another boson can in principle lead to a series of lines (a WIMP forest) at energies up to the WIMP mass. A specific model which illustrates this feature nicely is a theory of two universal extra dimensions compactified on a chiral square. Aside from the continuum emission, which is a generic prediction of most dark matter candidates, we find a "forest" of prominent annihilation lines that, after convolution with the angular resolution of current experiments, leads to a distinctive (2-bump plus continuum) spectrum, which may be visible in the near future with the Fermi Gamma-Ray Space Telescope (formerly known as GLAST).
Comment: 11 pages, 4 figures
