Rethinking authenticity in digital art preservation
This paper discusses the repositioning of the traditional conservation concepts of historicity, authenticity and versioning in relation to born-digital artworks, drawing on findings from research on the preservation of computer-based artifacts. Challenges for digital art preservation and previous work in this area are described, followed by an analysis of digital art as a process of component interaction, as performance, and in terms of instantiations. The concept of dynamic authenticity is proposed, and it is argued that our approach to the preservation of digital artworks should be variable and responsive to the digital object, with a level of variability tolerance to match digital art's intrinsic variability and dynamic authenticity.
Preservation through access: the AHDS performing arts collections in ECLAP and Europeana
This poster provides an overview of the ongoing rescue of
valuable digital collections that had been taken down and
consequently lost to general access.
The University of Glasgow was home to the Arts and Humanities
Data Service Performing Arts (AHDS Performing Arts) [1], one
of the five arts and humanities data centres that constitute the Arts
and Humanities Data Service (AHDS). From 1996 the AHDS supported the creation, curation, preservation and reuse of digital materials for the UK arts and humanities research and teaching community. AHDS Performing Arts, based in Glasgow, supported research, learning and teaching in music, dance, theatre, radio, film, television, and performance for thirteen years. Working with the AHDS Executive, relevant performing arts collections were ingested, documented, preserved, and where possible made available via the AHDS Cross Search Catalogue and website to researchers, practitioners, and the general public. Furthermore, strong relationships were developed with the research and teaching community, building upon a scoping study investigating user needs [2].
In 2007 the co-funders of the AHDS - Arts and Humanities
Research Council (AHRC) for the UK and the Joint Information
Systems Committee (JISC) - withdrew their funding. A detailed
risk assessment report was produced in response to the
withdrawal of core funding [3], but to no avail. When the AHDS funding stopped, online access to these cultural resources was eventually discontinued [4].
In 2010, the School of Culture and Creative Arts at the University
of Glasgow joined the EU-funded ECLAP project to ensure that at
least part of these resources could be accessible for the long term
by scholars and practitioners in the performing arts arena, and by
the general public. Below we briefly describe the ECLAP project, the AHDS Performing Arts collections progressively made available through it, and some thoughts on providing preservation through access for this type of digital cultural resource.
Bringing self assessment home: repository profiling and key lines of enquiry within DRAMBORA
Digital repositories are a manifestation of complex organizational, financial, legal, technological, procedural, and political interrelationships. Accompanying each of these are innate uncertainties, exacerbated by the relative immaturity of understanding prevalent within the digital preservation domain. Recent efforts have sought to identify core characteristics that must be demonstrable by successful digital repositories, expressed in the form of check-list documents, intended to support the processes of repository accreditation and certification. In isolation though, the available guidelines lack practical applicability; confusion over evidential requirements and difficulties associated with the diversity that exists among repositories (in terms of mandate, available resources, supported content and legal context) are particularly problematic. A gap exists between the available criteria and the ways and extent to which conformity can be demonstrated. The Digital Repository Audit Method Based on Risk Assessment (DRAMBORA) is a methodology for undertaking repository self assessment, developed jointly by the Digital Curation Centre (DCC) and DigitalPreservationEurope (DPE). DRAMBORA requires repositories to expose their organization, policies and infrastructures to rigorous scrutiny through a series of highly structured exercises, enabling them to build a comprehensive registry of their most pertinent risks, arranged into a structure that facilitates effective management. It draws on experiences accumulated throughout 18 evaluative pilot assessments undertaken in an internationally diverse selection of repositories, digital libraries and data centres (including institutions and services such as the UK National Digital Archive of Datasets, the National Archives of Scotland, Gallica at the National Library of France and the CERN Document Server). 
Other organizations, such as the British Library, have been using sections of DRAMBORA within their own risk assessment procedures.
Despite the attractive benefits of a bottom-up approach, there are implicit challenges posed by neglecting a more objective perspective. Following a sustained period of pilot audits undertaken by DPE, DCC and the DELOS Digital Preservation Cluster aimed at evaluating DRAMBORA, it was concluded that, had the respective project members not been present to facilitate each assessment and contribute their objective, external perspectives, the results might have been less useful. Consequently, DRAMBORA has developed in a number of ways: to enable knowledge transfer from the responses of comparable repositories, and to incorporate more opportunities for structured question sets, or key lines of enquiry, that provoke a more comprehensive awareness of the applicability of particular threats and opportunities.
Assessing digital preservation frameworks: the approach of the SHAMAN project
How can we deliver infrastructure capable of supporting the
preservation of digital objects, as well as the services that can be applied to those digital objects, in ways that future unknown systems will understand? A critical problem in developing systems is the process of validating whether the delivered solution effectively reflects the validated requirements. This is a challenge also for the EU-funded SHAMAN project, which aims to develop an integrated preservation framework using grid-technologies for distributed networks of digital preservation systems, for managing the storage, access, presentation, and manipulation of digital objects over time. Recognising this, the project team ensured that alongside the user requirements an assessment framework was developed. This paper presents the assessment of the SHAMAN demonstrators for the memory institution, industrial design and engineering and eScience domains, from the point of view of
users' needs and fitness for purpose. An innovative synergistic use of TRAC criteria, the DRAMBORA risk registry and mitigation strategies, iRODS rules, and information system model requirements has been designed, with the underlying goal of defining the associated policies, rules and state information, and making them wherever possible machine-encodable and enforceable. The described assessment framework can be valuable not only for the implementers of this project's preservation framework, but for the wider digital preservation community, because it provides a holistic approach to assessing and validating the preservation of digital libraries, digital repositories and data centres.
Raman spectroscopy study of the interface structure in (CaCuO2)n/(SrTiO3)m superlattices
Raman spectra of CaCuO2/SrTiO3 superlattices show clear spectroscopic markers of two structures formed in CaCuO2 at the interface with SrTiO3. For non-superconducting superlattices, grown in a low oxidizing atmosphere, the 425 cm-1 frequency of the oxygen vibration in the CuO2 planes is the same as for CCO films with the infinite-layer structure (planar Cu-O coordination). For superconducting superlattices grown in a highly oxidizing atmosphere, a 60 cm-1 frequency shift to lower energy occurs. This is ascribed to a change from planar to pyramidal Cu-O coordination caused by oxygen incorporation at the interface. Raman spectroscopy thus proves to be a powerful tool for investigating interface structure.
DesignNet: an online knowledge gateway for industrial design education and research activities
This paper presents DesignNet, a knowledge-based system for the online display, retrieval and archiving of rich-media resources for industrial design education and research. It addresses the needs of end-users (teachers, researchers and students) and content providers interacting with the School of Design of the Politecnico di Milano. The project proceeds from the observation that the traditional modalities of archiving and presentation currently adopted by the Politecnico and other academic institutions are not coherent with the industrial design process and its need for project-support materials. The typical outputs of the industrial design process are 3D models or 2D graphics, not just texts or simple images, the materials for which the usual methods and techniques of archiving and retrieval were conceived and developed. The challenges, philosophy and methodology involved in creating this evolving Web-based, cataloguing, multimedia knowledge base of VR design resources are discussed. Finally, the related system and prototype are described.
Renormalization of Coulomb interactions in the s-wave superconductor NaxCoO2
We study the renormalized Coulomb interactions due to the retardation effect in NaxCoO2. Although the Morel-Anderson pseudopotential within a single orbital is relatively large because the direct Coulomb repulsion is large, that for the interband transition between orbitals is very small, since the renormalization factor for pair hopping is the square of the single-orbital one. Therefore, the s-wave superconductivity due to the valence-band Suhl-Kondo mechanism survives against strong Coulomb interactions. The interband hopping of Cooper pairs due to shear phonons is essential to understanding the superconductivity in NaxCoO2.
Comment: 2 pages, 2 figures, Proceedings of ICM in Kyoto, 200
Experimental determination of the frequency and field dependence of Specific Loss Power in Magnetic Fluid Hyperthermia
Magnetic nanoparticles are promising systems for biomedical applications and
in particular for Magnetic Fluid Hyperthermia, a promising therapy that
utilizes the heat released by such systems to damage tumor cells. We present an
experimental study of the physical properties that influence the capability of heat release, i.e. the Specific Loss Power (SLP), of three biocompatible ferrofluid samples having a magnetic core of maghemite with different core diameters d = 10.2, 14.6 and 19.7 nm. The SLP was measured as a function of the frequency f and the intensity of the applied alternating magnetic field H, and it turned out to depend on the core diameter, as expected. The results allowed us to highlight experimentally that the physical mechanism responsible for the heating is size-dependent and to establish, at constant applied frequency, the phenomenological functional relationship SLP = cH^x, with 2 < x < 3 for all samples. The x-value depends on sample size and field frequency/intensity, here chosen in the typical operating range of magnetic hyperthermia devices. For the smallest sample, the effective relaxation time Teff = 19.5 ns obtained from the SLP data is in agreement with the value estimated from magnetization data, thus confirming the validity of the Linear Response Theory model for this system at properly chosen field intensity and frequency.
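A power-law relationship of the form SLP = cH^x, as in the abstract above, is typically extracted by a linear fit on log-log axes. The sketch below illustrates this with entirely hypothetical field amplitudes and SLP values (not data from the paper); the constants c_true and x_true are made-up illustrative numbers with x in the reported 2 < x < 3 range.

```python
import numpy as np

# Hypothetical field amplitudes H (kA/m) and SLP values (W/g) generated
# from the phenomenological law SLP = c * H**x. These numbers are
# illustrative only, not measured data.
H = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
c_true, x_true = 0.12, 2.4  # assumed values, with 2 < x < 3
SLP = c_true * H**x_true

# Taking logs linearizes the law: log SLP = log c + x * log H,
# so a degree-1 polynomial fit recovers the exponent x as the slope.
x_fit, log_c_fit = np.polyfit(np.log(H), np.log(SLP), 1)
c_fit = np.exp(log_c_fit)
print(f"x = {x_fit:.2f}, c = {c_fit:.3f}")
```

With noisy measurements the same fit applies, and the uncertainty on x follows from the residuals of the linear regression.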
Lagrangian filtered density function for LES-based stochastic modelling of turbulent dispersed flows
The Eulerian-Lagrangian approach based on Large-Eddy Simulation (LES) is one
of the most promising and viable numerical tools to study turbulent dispersed
flows when the computational cost of Direct Numerical Simulation (DNS) becomes
too expensive. The applicability of this approach is however limited if the
effects of the Sub-Grid Scales (SGS) of the flow on particle dynamics are
neglected. In this paper, we propose to take these effects into account by
means of a Lagrangian stochastic SGS model for the equations of particle
motion. The model extends to particle-laden flows the velocity-filtered density
function method originally developed for reactive flows. The underlying
filtered density function is simulated through a Lagrangian Monte Carlo
procedure that solves for a set of Stochastic Differential Equations (SDEs)
along individual particle trajectories. The resulting model is tested for the
reference case of turbulent channel flow, using a hybrid algorithm in which the
fluid velocity field is provided by LES and then used to advance the SDEs in
time. The model consistency is assessed in the limit of particles with zero
inertia, when "duplicate fields" are available from both the Eulerian LES and
the Lagrangian tracking. Tests with inertial particles were performed to
examine the capability of the model to capture particle preferential
concentration and near-wall segregation. Upon comparison with DNS-based
statistics, our results show improved accuracy and considerably reduced errors
with respect to the case in which no SGS model is used in the equations of
particle motion.
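The Lagrangian Monte Carlo procedure described above advances stochastic differential equations along particle paths. The sketch below shows a generic Euler-Maruyama step for a Langevin-type SDE of the kind commonly used in such stochastic subgrid models: a velocity relaxes toward the filtered LES velocity over a timescale T, with a white-noise forcing. All symbols and numerical values here are assumptions for illustration, not the paper's actual model or coefficients.

```python
import numpy as np

def euler_maruyama_step(U_s, U_les, T, C0, eps, dt, rng):
    """One Euler-Maruyama step of a Langevin-type SDE (illustrative only):
    dU_s = -(U_s - U_les)/T dt + sqrt(C0 * eps) dW,
    where U_s is the modelled velocity, U_les the filtered LES velocity,
    T a relaxation timescale, and eps a dissipation-like rate."""
    drift = -(U_s - U_les) / T
    noise = np.sqrt(C0 * eps * dt) * rng.standard_normal(U_s.shape)
    return U_s + drift * dt + noise

rng = np.random.default_rng(0)
U = np.zeros(3)                      # initial modelled velocity
U_les = np.array([1.0, 0.0, 0.0])    # hypothetical filtered velocity
for _ in range(1000):
    U = euler_maruyama_step(U, U_les, T=0.1, C0=2.1, eps=0.05,
                            dt=1e-3, rng=rng)
print(U)
```

Over many relaxation times the modelled velocity fluctuates about the filtered value, which is the qualitative behaviour a stochastic SGS closure is meant to supply on top of the resolved LES field.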
