
    Music without musicians ... but with scientists, technicians and computer companies

    This publication is freely accessible with the permission of the rights owner, under an Alliance licence or a national licence funded by the DFG (German Research Foundation).

    In the early days of music technologies the collaboration between musicians, scientists, technicians and equipment producers was very close. How did this collaboration develop? Why did scientific, business, and musical agendas converge towards a common goal? Was there a mutual exchange of skills and expertise? To answer these questions this article will consider a case study in early computer music. It will examine the career of the Italian cellist and composer Pietro Grossi (1917–2002), who explored computer music with the support of mainframe manufacturers, industrial R&D, and scientific institutions. During the 1970s, Grossi became an eager programmer and gained first-hand experience of computer music, writing several software packages. Grossi was interested in avant-garde music as an opportunity to make ‘music without musicians’. He aimed at music composed and performed by machines, and eventually he achieved this result with his music software. However, to accomplish it, Grossi could not be a lone pioneer; he had to become a member, albeit an atypical one, of the Italian computing community of the time. Grossi’s story, thus, can tell us much about the collaborative efforts stimulated by the use of early computer technologies in sound research, and how these efforts developed at the intersection of science, art and industry.

    In pursuit of a science of agriculture: the role of statistics in field experiments

    Since the beginning of the twentieth century, statistics has reshaped the experimental cultures of agricultural research, taking part in the subtle dialectic between the epistemic and the material that is proper to experimental systems. This transformation has become especially relevant in field trials, and the paper will examine the British agricultural institution, Rothamsted Experimental Station, where statistical methods nowadays popular in the planning and analysis of field experiments were developed in the 1920s. At Rothamsted statistics promoted randomisation over systematic arrangements and factorisation over one-question trials, and emphasised the importance of the experimental error in assessing field trials. These changes in methodology also transformed the material culture of agricultural science, and a new body, the Field Plots Committee, was created to manage the field research of the agricultural institution. Although successful, the vision of field experimentation proposed by the Rothamsted statisticians was not unproblematic. Experimental scientists closely linked to the farming community questioned it in favour of a field research that could be more easily understood by farmers. The clash between the two agendas reveals how the role attributed to statistics in field experimentation defined different pursuits of agricultural research, alternately conceived of as a scientists’ science or as a farmers’ science.
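    The randomised allocation advocated by the Rothamsted statisticians can be sketched in a few lines of code. This is an illustrative sketch, not a reconstruction of any historical Rothamsted trial: the treatment labels and block layout are invented for the example.

```python
# A minimal sketch of randomised allocation in a field trial: treatments are
# assigned to plots by chance rather than in a fixed systematic pattern, so
# systematic soil gradients cannot consistently favour one treatment.
# Treatment labels and block count are invented for illustration.
import random

treatments = ["A", "B", "C", "D"]
n_blocks = 3
random.seed(42)  # fixed seed so the layout is reproducible

# Randomised complete block design: every treatment appears exactly once
# per block, in an independently shuffled order.
layout = []
for block in range(n_blocks):
    order = treatments[:]
    random.shuffle(order)
    layout.append(order)

for block, order in enumerate(layout, start=1):
    print(f"block {block}: {' '.join(order)}")
```

    Each block contains all treatments, so block-to-block differences (soil fertility, drainage) can be separated from treatment effects in the analysis.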

    Building human and industrial capacity in European biotechnology: the Yeast Genome Sequencing Project (1989–1996)

    During the years 1989–1996 the European Commission took a leading role in sequencing the yeast genome. The project was completed in April 1996 and celebrated as the success of a European research strategy based on a distributed model of scientific collaboration. Almost one hundred laboratories and private companies dispersed all over Europe took part in the sequencing work sponsored by the European Commission, and an industrial platform was created to facilitate the exploitation of the genomic data by companies interested in yeast. The yeast genome project was part of the biotechnology strategy developed by the European Commission during the 1980s and 1990s. The Commission expected biotechnology to be relevant in crucial areas of political, economic and social intervention and wanted to promote economic growth and contribute to the process of European integration by developing a community strategy in biotechnology. Due to the strong industrial value of yeast, which is used by agrofood, pharmaceutical and biotechnology companies, sequencing the genome of this microorganism proved an ideal opportunity to pursue the Commission’s plans, and the paper will examine how the yeast genome project was shaped to build human and industrial capacity in European biotechnology. By investigating capacity building, it will be possible to understand why the European Commission decided to sponsor and coordinate a scientific project in genomics, with the ultimate aim of strengthening economic growth in the biotechnology sector and promoting integration among new and old member states of the European Economic Community.

    EC/H2020/678757/EU/Medical translation in the history of modern genomics/Transgen

    The emergence of modern statistics in agricultural science: Analysis of variance, experimental design and the reshaping of research at Rothamsted Experimental Station, 1919–1933

    During the twentieth century statistical methods have transformed research in the experimental and social sciences. Qualitative evidence has largely been replaced by quantitative results, and the tools of statistical inference have helped foster a new ideal of objectivity in scientific knowledge. The paper will investigate this transformation by considering the genesis of analysis of variance and experimental design, statistical methods nowadays taught in every elementary course of statistics for the experimental and social sciences. These methods were developed by the mathematician and geneticist R. A. Fisher during the 1920s, while he was working at Rothamsted Experimental Station, where agricultural research was in turn reshaped by Fisher’s methods. Analysis of variance and experimental design required new practices and instruments in field and laboratory research, and imposed a redistribution of expertise among statisticians, experimental scientists and the farm staff. On the other hand, the use of statistical methods in agricultural science called for a systematization of information management and made computing an activity integral to the experimental research done at Rothamsted, permanently integrating the statisticians’ tools and expertise into the station research programme. Fisher’s statistical methods did not remain confined within agricultural research, and by the end of the 1950s they had come to stay in psychology, sociology, education, chemistry, medicine, engineering, economics and quality control, to mention just a few of the disciplines that adopted them.
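    The core of Fisher's analysis of variance can be shown in a short, self-contained sketch: partition the total variation into a between-treatment component and a within-treatment (error) component, then compare the two as an F ratio. The yield figures below are invented illustrative numbers, not Rothamsted data.

```python
# A minimal sketch of one-way analysis of variance (ANOVA): partition total
# variation into between-treatment and within-treatment sums of squares and
# form the F statistic. Illustrative data only.

def one_way_anova(groups):
    """Return (F statistic, df_between, df_within) for a list of samples."""
    k = len(groups)                          # number of treatments
    n = sum(len(g) for g in groups)          # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-treatment sum of squares: how far group means sit from the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-treatment (error) sum of squares: spread inside each group
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Three hypothetical fertiliser treatments, four plots each
plots = [
    [4.2, 4.8, 4.5, 4.9],   # treatment A
    [5.1, 5.6, 5.3, 5.8],   # treatment B
    [4.0, 4.3, 4.1, 4.4],   # treatment C
]
f, df1, df2 = one_way_anova(plots)
print(f"F({df1}, {df2}) = {f:.2f}")
```

    A large F value indicates that the differences between treatment means are large relative to the experimental error within treatments, which is exactly the comparison Fisher's method makes precise.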

    From Computing Girls to Data Processors: Women Assistants in the Rothamsted Statistics Department


    AI-Ready Data in Biodiversity Informatics: Why Machine-Readability Matters

    Biodiversity science relies on diverse and complex data sources to trace changes in species distribution and their causes, and uses this information to help address global environmental challenges. AI tools can help tackle these challenges, but are biodiversity data AI-ready? Only digital data that can be managed via automatic pipelines and consumed directly by a machine learning model or a deep neural network can be considered AI-ready. Making data and metadata available in a machine-readable format is therefore the first step towards AI-readiness. For this reason, a GitHub repository has been created to provide an overview of suitable machine-readable data formats in biodiversity informatics, illustrate their contexts of application, and warn against possible pitfalls. Both machine-readable data formats of general use (e.g., CSV) and formats specific to biodiversity science (e.g., DwC-A) are examined.
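    What "machine-readable" means in practice can be shown with a tiny example: species occurrence records stored as CSV with a fixed header can be consumed by standard tooling with no manual cleaning. The column names below follow Darwin Core terms (scientificName, decimalLatitude, decimalLongitude, eventDate), but the records themselves are invented for illustration.

```python
# A minimal sketch of machine-readable biodiversity data: occurrence records
# as CSV, parsed into plain dicts that a downstream pipeline or model can
# consume directly. Records are invented; column names follow Darwin Core.
import csv
import io

raw = """scientificName,decimalLatitude,decimalLongitude,eventDate
Parus major,51.48,-0.17,2021-05-03
Turdus merula,51.50,-0.12,2021-05-04
"""

records = list(csv.DictReader(io.StringIO(raw)))
# Numeric fields still arrive as strings and must be cast explicitly —
# one of the pitfalls such format guides warn about.
coords = [(float(r["decimalLatitude"]), float(r["decimalLongitude"]))
          for r in records]
print(records[0]["scientificName"], coords[0])
```

    A Darwin Core Archive (DwC-A) packages such CSV tables together with an XML descriptor that maps each column to a standard term, which is what makes fully automatic ingestion possible.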

    Charting the history of agricultural experiments


    Music Without Musicians: Pietro Grossi’s Experience in Electronic and Computer Music

    Pietro Grossi (1917–2002) and the S 2F M, the electronic music studio he created in Florence, are among the Italian experiences in electronic music listed in Hugh Davies’ International Electronic Music Catalogue. Fascinated by the new opportunities offered by technologies, Grossi, a cellist and composer, left a successful thirty-year career in the orchestra of the Maggio Musicale Fiorentino for an uncertain venture in electronic and computer music, and later also in visual art. Grossi’s choice was inspired by a radical project: to make music without musicians, to free the intellectual act of composition from the labour of the musical performance. Grossi’s interest in electronic music began with a visit to the electronic music studio set up by the Italian public broadcasting corporation (RAI) in Milan. There he created his first composition in electronic music, Progetto 2-3 (1961), based on combinatorics. Enthusiastic about this experience, in 1963 Grossi assembled in his house in Florence some basic equipment for making electronic music. The “Studio di Fonologia Musicale di Firenze” (S 2F M) was born. In 1965 the equipment moved from Grossi’s house to the Conservatorio Luigi Cherubini, the Florentine music school, where Grossi had been teaching cello for many years, and where he also began to teach a class in electronic music. The equipment became available to the composers and the students of the school interested in experimenting with the opportunities offered by electronics, and engineers and visual artists curious about electronic music also took part in Grossi’s work. Throughout the 1960s the S 2F M produced compositions, either by Grossi or by his co-workers. Many of these compositions circulated under the name of the studio, not of the single composer, because Grossi conceived electronic music as an open-ended enterprise in which the tapes produced by others could be dismembered and pieces borrowed and reassembled to create a new musical experience.

    In the second half of the 1960s Grossi began to experiment with computers. In the following decades computer music became the main interest of Grossi and his S 2F M. Establishing collaborations with producers of mainframes based in Italy (Olivetti-General Electric and IBM) and scientific institutions involved in computational research (in particular the Italian University Computing Centre (CNUCE) based in Pisa), Grossi had the opportunity to promote the development of both hardware and software tools for computer music. More appreciated by computer scientists and high-tech entrepreneurs than by musicians, in the 1980s Grossi’s work eventually turned to the production of visual art displays with the personal computer (Homeart). My talk will analyse Grossi’s career path from electronic to computer music. I will investigate the role that technologies – at first oscillators, filters and synthesizers, and later digital computers – played in this process and the alliances that Grossi established with technicians, scientists and hardware developers to pursue his projects. I will argue that in Grossi’s vision electronic music was the first, though incomplete, step towards the full automation of the musical performance that computers made possible. The talk will also consider Grossi’s connections with national and international experiences in electronic and computer music, and his activities for the dissemination of electronic and computer music in collaboration with the Italian public broadcasting corporation.