Identity: Philosophy or Science?
Recently there has been renewed interest in psycho-neural identity theory. This is in large part due to Heuristic Identity Theory, which brings new insights into the relation between psychology and neuroscience. Perhaps even more significant is its concept of hypothetical identity, which McCauley & Bechtel position to eclipse classical theories of psycho-neural identity in virtue of its relevance to scientific practice, specifically inter-level contexts. McCauley & Bechtel claim that, in addition to providing an accurate representation of the practices of science, Heuristic Identity Theory also answers some philosophical objections directed at classical identity theory. The correlation objection states that there is no conceivable observation that could confirm or refute an identity without also confirming or refuting the associated correlation. In this paper, I compare the classical psycho-neural identity theories of J.J.C. Smart and U.T. Place to Heuristic Identity Theory through their relation to the correlation objection. I aim to clarify the distinction between the two kinds of identity theory, one being a philosophical thesis and the other a method of science. Through this I will show that the correlation objection is not directed at Heuristic Identity Theory and that McCauley & Bechtel therefore do not adequately answer the objection.
Novel Predictions: From Empiricism to Unificationism
From Fresnel’s wave theory of light to Einstein’s general theory of relativity, novel predictions have a long history in the method of science. Since predictions concern empirical matters, associated models of method are usually empiricist ones. However, much recent philosophy of science places little emphasis on novel predictions. The central reasons include the general thesis of the underdetermination of theory by evidence and the marginalization of novelty to a narrower issue in theory assessment, the prediction-accommodation distinction. In particular, novelty has become little more than code for methods classified as unificationist criteria of theory assessment. In this paper, I extend Harker’s criticisms to a broader history of novel predictions in philosophy of science. I then suggest a philosophy of science rooted in empiricist ideas, new experimentalism, to recontextualize novel predictions and their epistemological role. I do so in hopes of rehabilitating novel predictions as the core of empirical methods. The literature known as new experimentalism offers a particularly promising context for this attempt because it concerns itself with empirical progress as analyzed through the many epistemic values of experiments. The guiding theme of new experimentalism is the theory-independence of experimental phenomena; it thus bypasses the unificatory methods and extra-empirical tools of theoretical science. Through this, I aim to provide a basic characterization of novel predictions as experimental interventions, contrary to their current unificationist formulations.
Integration of Deep Computer Vision Foundation Models for Document Interpretation and Anonymisation
Visual Document Understanding (VDU) models, whether combined with Optical Character Recognition (OCR) or OCR-free, offer businesses and institutions a great opportunity to digitalise their processes and improve workflows. Digitalisation is progressing; however, challenges remain, such as building sufficient know-how to integrate VDU models, complying with data protection regulations, and identifying the processes where VDU models offer the most significant benefit.
The main goal of this work is to analyse and evaluate the practicality and appropriateness of available VDU models for processing documents (e.g. PDFs of scanned documents) and to demonstrate them in a Proof-of-Concept (POC) application. Even though some regulatory aspects, especially regarding anonymisation, are discussed in the work, the developed application does not aspire to be regulatory compliant.
During this work, two areas were identified where a tool that extracts text from an image, identifies relevant personal-information entities, and anonymises them would be beneficial. First, the anonymisation of medical documents makes more data available for research and educational purposes. A second application is data-leakage prevention, where detecting client data in screenshots would lower the risk of data breaches.
Various tools exist to extract text from an image. Within the scope of this project, three tools have been integrated: Tesseract, Amazon Textract and OpenAI GPT-4V(ision). The application extracts the text of uploaded documents or images and presents the user with the resulting text from all three tools, letting the user select the text with the best quality. Afterwards, a Named Entity Recognition (NER) Transformer model (the bert-base-NER model) is used to identify the names of persons in the extracted text. The last step is the pseudonymisation of the entities: a randomly generated unique string replaces each entity in the text, so that a person cannot be identified from a name in the text.
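The pseudonymisation step described in the abstract can be sketched as follows. This is a minimal illustration, not the application's actual code: the entity list stands in for the output of a NER model such as bert-base-NER, the sample text and the `PERSON_` token format are invented, and entities are replaced by plain string matching, whereas a production system should use the character offsets the NER model reports so that look-alike substrings are not touched.

```python
import secrets

def pseudonymise(text: str, person_names: list[str]) -> tuple[str, dict[str, str]]:
    """Replace each detected person name with a random unique token.

    Illustrative sketch only: `person_names` stands in for NER output,
    and replacement is by plain string matching rather than character
    offsets.
    """
    mapping: dict[str, str] = {}
    for name in person_names:
        if name not in mapping:
            # The same name always maps to the same random token, so the
            # pseudonymised text stays internally consistent.
            mapping[name] = "PERSON_" + secrets.token_hex(4)
        text = text.replace(name, mapping[name])
    return text, mapping

# Invented example input; the entity list mimics what a NER model would return.
report = "Alice Meier was examined by Dr. Bob Kunz. Alice Meier recovered."
anon, key = pseudonymise(report, ["Alice Meier", "Bob Kunz"])
print(anon)
```

Keeping the `mapping` separate from the text is what makes this pseudonymisation rather than anonymisation: the mapping can be stored securely for re-identification, or discarded to make the replacement irreversible.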
Another feature of the application is the evaluation of OCR accuracy. The user can upload an additional ground-truth file, which is then compared with the output for the uploaded images. The OCR accuracy is calculated with the Jaro similarity string-comparison algorithm. Furthermore, the NER model can also be tested by uploading the expected entities of the document in a separate file. The test then shows how many of the provided entities have been found in the extracted text.
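The Jaro similarity used for the accuracy comparison is a standard string metric; a self-contained reference implementation (a sketch of the standard algorithm, not the application's code, which may well use a library) looks like this:

```python
def jaro_similarity(s1: str, s2: str) -> float:
    """Jaro similarity: 1.0 for identical strings, 0.0 for no match."""
    if s1 == s2:
        return 1.0
    if not s1 or not s2:
        return 0.0
    # Two characters "match" if equal and at most `window` positions apart.
    window = max(len(s1), len(s2)) // 2 - 1
    s1_matched = [False] * len(s1)
    s2_matched = [False] * len(s2)
    matches = 0
    for i, c in enumerate(s1):
        lo, hi = max(0, i - window), min(len(s2), i + window + 1)
        for j in range(lo, hi):
            if not s2_matched[j] and s2[j] == c:
                s1_matched[i] = s2_matched[j] = True
                matches += 1
                break
    if matches == 0:
        return 0.0
    # Count matched characters that appear in a different order (transpositions).
    transpositions = 0
    k = 0
    for i, c in enumerate(s1):
        if s1_matched[i]:
            while not s2_matched[k]:
                k += 1
            if c != s2[k]:
                transpositions += 1
            k += 1
    transpositions //= 2
    m = matches
    return (m / len(s1) + m / len(s2) + (m - transpositions) / m) / 3.0

print(round(jaro_similarity("MARTHA", "MARHTA"), 4))  # -> 0.9444
```

Because the score is symmetric and length-normalised, it gives a single 0-to-1 accuracy figure per document regardless of how long the ground-truth text is.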
It is impressive how powerful today's text extraction and NER models have become. However, during the work it became clear that they are not yet off-the-shelf and ready to use. No tool works perfectly, so errors propagate to subsequent processing steps, and the outputs of the tools are not standardised. To overcome such limitations, text extraction and entity recognition should be performed by a single model that is also fine-tuned on the specific document types.
Broadband, Non-destructive Characterisation of PEC-backed Materials Using a Dual-ridged-waveguide Probe
A new probe which utilises a dual-ridged waveguide to provide broadband, non-destructive (ND) material characterisation measurements of a perfect electric conductor (PEC)-backed material is introduced. The new probe possesses a bandwidth similar to existing coaxial probes and is structurally robust like rectangular waveguide probes. The combination of these two qualities makes it especially attractive for ND inspection/evaluation applications in the field. The theoretical development of the dual-ridged-waveguide probe is discussed. A magnetic field integral equation is derived by applying Love’s equivalence theorem and enforcing the continuity of transverse fields at the dual-ridged-waveguide aperture. The magnetic field integral equation is then solved for the theoretical reflection coefficient using the method of moments. The permittivity and permeability of the material under test are found by minimising the root-mean-square difference between the theoretical and measured reflection coefficients using non-linear least squares. To validate the new probe, experimental results for a magnetic absorbing material are presented, comparing results obtained using the new probe with those obtained using a traditional, destructive technique. The probe’s sensitivity to sample thickness, flange-plate thickness, cutoff wavenumber and measured S-parameter uncertainties is also investigated.
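The inverse problem in the abstract above — recover material parameters by minimising the RMS difference between a modelled and a measured reflection coefficient — can be sketched in miniature. The paper's forward model is a method-of-moments solution of a magnetic field integral equation, which is far too involved to reproduce here; as a stand-in, the sketch uses the textbook normal-incidence reflection coefficient of a PEC-backed slab, assumes a non-magnetic sample (mu_r = 1), and replaces the non-linear least-squares solver with a coarse grid search. The frequencies, 2 mm thickness, and permittivity value are all invented for the demonstration.

```python
import cmath

ETA0 = 376.730313668   # free-space wave impedance (ohm)
C0 = 299_792_458.0     # speed of light (m/s)

def gamma_pec_backed(freq_hz: float, eps_r: complex, mu_r: complex,
                     thickness_m: float) -> complex:
    """Normal-incidence reflection coefficient of a PEC-backed slab.

    Simplified stand-in for the paper's method-of-moments aperture model;
    it only illustrates the shape of the inverse problem.
    """
    k = 2 * cmath.pi * freq_hz / C0 * cmath.sqrt(eps_r * mu_r)  # slab wavenumber
    eta = ETA0 * cmath.sqrt(mu_r / eps_r)                       # slab wave impedance
    z_in = 1j * eta * cmath.tan(k * thickness_m)  # short-circuited-line input impedance
    return (z_in - ETA0) / (z_in + ETA0)

def rms_misfit(eps_r: complex, freqs, measured, thickness_m: float) -> float:
    """RMS difference between modelled and measured reflection coefficients."""
    diffs = [abs(gamma_pec_backed(f, eps_r, 1.0, thickness_m) - g)
             for f, g in zip(freqs, measured)]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

# Synthetic "measurement" from a known permittivity, then a coarse grid search
# standing in for the paper's non-linear least-squares solver.
freqs = [f * 1e9 for f in range(2, 9)]   # 2-8 GHz (invented band)
thickness = 2e-3                         # 2 mm sample (invented)
true_eps = 4.0 - 0.5j                    # invented "unknown" permittivity
measured = [gamma_pec_backed(f, true_eps, 1.0, thickness) for f in freqs]

grid = [complex(0.25 * re, -0.05 * im)
        for re in range(4, 25)           # Re(eps_r): 1.00 .. 6.00
        for im in range(0, 21)]          # Im(eps_r): 0.00 .. -1.00
best = min(grid, key=lambda e: rms_misfit(e, freqs, measured, thickness))
print(best)
```

A real extraction would use a gradient-based non-linear least-squares routine over both complex permittivity and permeability, as the abstract states; the grid search is used here only because it keeps the sketch dependency-free.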
A Clamped Dual-Ridged Waveguide Measurement System for the Broadband, Nondestructive Characterization of Sheet Materials
A novel two-port probe which uses dual-ridged waveguides for the nondestructive, broadband characterization of sheet materials is presented. The new probe is shown to possess approximately 2 to 3 times the bandwidth of traditional coaxial and rectangular/circular waveguide probe systems while maintaining the structural robustness characteristic of rectangular/circular waveguide probe systems. The theoretical development of the probe is presented: by applying Love’s equivalence theorem and enforcing the continuity of transverse fields at the dual-ridged waveguide apertures, a system of coupled magnetic field integral equations is derived. The system of coupled magnetic field integral equations is solved using the method of moments to yield theoretical expressions for the reflection and transmission coefficients. The complex permittivity and permeability of the unknown material under test are then found by minimizing the root-mean-square difference between the theoretical and measured reflection and transmission coefficients. Experimental results for two magnetic absorbing materials are presented to validate the new probe. The probe’s sensitivity to errors in the measured scattering parameters, sample thickness, and flange-plate thickness is also investigated.
