Microalgae cultivation for lipids and carbohydrates production
Microalgae are photoautotrophic microorganisms that can produce energy both by using sunlight, water and CO2 (phototrophic metabolism) and by using organic carbon sources such as glucose (heterotrophic metabolism). Heterotrophic growth is a key factor in microalgae research due to its increased productivity and its lower capital and operating costs compared to photoautotrophic growth in photobioreactors. Carbohydrate production from microalgae is usually investigated for the production of biofuels (e.g. bioethanol) by subsequent fermentation, but other applications, such as biopolymers, can also be envisaged. In this work, an increase in carbohydrate purity after lipid extraction was found. Protein hydrolysis for different microalgae strains (Scenedesmus sp. and Chlorella sp.) was investigated. Microalgae were cultivated under photoautotrophic or heterotrophic conditions, and biomass was collected at the end of the growth. Biomass samples were dried or freeze-dried and used for carbohydrate and lipid extraction tests. Lipid extraction was performed with different organic solvents (methanol-chloroform and hexane-2-propanol). Basic protein hydrolysis was carried out testing different temperatures and NaOH concentrations. Lipids were quantified spectrophotometrically, while the residual biomass was saccharified and the total amount of sugars was measured. Significant differences in the purity of the extracted carbohydrates were found when comparing dried with freeze-dried biomass. However, the purification of carbohydrates achieved after protein hydrolysis was not very promising, calling for further analysis. © Copyright 2017, AIDIC Servizi S.r.l.
A demonstrator for bolometric interferometry
Bolometric Interferometry (BI) is one of the most promising techniques for
precise measurements of the Cosmic Microwave Background polarization. In this
paper, we present the results of DIBO (Demonstrateur d'Interferometrie
Bolometrique), a single-baseline demonstrator operating at 90 GHz, built to
prove the validity of the BI concept applied to a millimeter-wave
interferometer. This instrument has been characterized in the laboratory with a
detector at room temperature and with a 4 K bolometer. This allowed us to
measure interference patterns in a clean way, both (1) rotating the source and
(2) varying in time the phase shift between the two interferometer arms.
Detailed modelling has also been performed and validated against measurements. Comment: 15 pages, 14 figures
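The phase-modulation measurement described above can be illustrated with an idealized fringe model: for a two-arm interferometer, the detected power oscillates as the phase shift between the arms is varied. This is a minimal sketch, not the DIBO analysis code; the function name, the visibility value and the phase sweep are assumptions.

```python
import numpy as np

def fringe_power(delta_phi, visibility=1.0, p0=1.0):
    """Idealized single-baseline interferometer output: the detected
    power oscillates with the phase shift between the two arms.
    `visibility` < 1 models a partially coherent (imperfect) fringe."""
    return p0 * (1.0 + visibility * np.cos(delta_phi))

# Sweep the arm-to-arm phase shift over one full turn,
# as in a phase-modulation scan.
phases = np.linspace(0.0, 2.0 * np.pi, 9)
pattern = fringe_power(phases, visibility=0.8)
```

Rotating the source instead changes the geometric delay between the arms, which enters the same model as an extra phase term.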
There's plenty of light at the bottom: Statistics of photon penetration depth in random media
We propose a comprehensive statistical approach describing the penetration depth of light in random media. The presented theory exploits the concept of probability density function f(z|ρ, t) for the maximum depth reached by the photons that are eventually re-emitted from the surface of the medium at distance ρ and time t. Analytical formulas for f, for the mean maximum depth 〈zmax〉 and for the mean average depth 〈z〉 reached by the detected photons at the surface of a diffusive slab are derived within the framework of the diffusion approximation to the radiative transfer equation, both in the time domain and the continuous wave domain. Validation of the theory by means of comparisons with Monte Carlo simulations is also presented. The results are of interest for many research fields such as biomedical optics, advanced microscopy and disordered photonics
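The quantity studied above, the maximum depth reached by photons that are eventually re-emitted at the surface, can be estimated with a toy Monte Carlo random walk. This is a minimal illustrative sketch, not the simulation used for the validation: it assumes isotropic scattering, no absorption, a semi-infinite medium, and arbitrary parameter values.

```python
import math
import random

def max_depths_of_reemitted_photons(n_photons=2000, mu_s=1.0,
                                    max_steps=10_000, seed=0):
    """Toy isotropic-scattering random walk in a semi-infinite medium
    (z >= 0): launch photons downward at the surface z = 0 and record
    the maximum depth reached by each photon that eventually crosses
    back through the surface (i.e., is re-emitted)."""
    rng = random.Random(seed)
    depths = []
    for _ in range(n_photons):
        z, uz = 0.0, 1.0          # start at the surface, heading into the medium
        zmax = 0.0
        for _ in range(max_steps):
            step = -math.log(rng.random()) / mu_s   # exponential free path
            z += uz * step
            if z <= 0.0:          # photon re-emitted at the surface
                depths.append(zmax)
                break
            zmax = max(zmax, z)
            uz = 2.0 * rng.random() - 1.0  # isotropic: uniform cosine in [-1, 1]
        # photons still inside after max_steps are simply discarded
    return depths

depths = max_depths_of_reemitted_photons()
mean_zmax = sum(depths) / len(depths)
```

Binning `depths` by the exit position and exit time of each photon would give a histogram estimate of f(z|ρ, t) to compare against the analytical formulas.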
New frontiers in time-domain diffuse optics, a review
This review focuses on recent developments in time-domain diffuse optics that rely on physical concepts (e.g., time-gating and null source-detector distance) and advanced photonic components (e.g., vertical-cavity surface-emitting lasers as light sources; single-photon avalanche diodes and silicon photomultipliers as detectors; fast-gating circuits; and time-to-digital converters for acquisition). This study shows how these tools could lead, on the one hand, to compact and wearable time-domain devices for point-of-care diagnostics down to the consumer level and, on the other hand, to powerful systems with exceptional depth penetration and sensitivity
A congenital anterior urethrocutaneous fistula in a boy whose mother was exposed to ionizing radiations: a case report and literature review
Anterior congenital urethrocutaneous fistula is a rare anomaly that may present in isolation or in association with other anomalies of the genitourinary tract or anorectal malformations. We describe a case of congenital anterior urethrocutaneous fistula, not associated with other congenital anomalies, in a 3-year-old boy whose mother had been exposed to Chernobyl's nuclear fallout. The patient was successfully operated on, with no recurrence. We review the literature on etiology and surgical strategy, including the role of ionizing radiation. Congenital anterior urethrocutaneous fistula remains a rare malformation whose etiopathogenesis is unknown
Optimization Under Uncertainty: Applications to Machine Learning and Waste Management
In this thesis, we deal with optimization problems affected by uncertainty. The first class of problems we analyze aims at separating sets of data points by means of linear and nonlinear classifiers. The classification task is performed according to variants of the Support Vector Machine (SVM), and the uncertainty in real-world data is handled by means of Robust Optimization (RO) techniques. In the case of binary classification, we start by formulating a novel SVM-type model with nonlinear classifiers and perfectly known data points. Secondly, to prevent low accuracies in the classification process due to data perturbations, we construct bounded-by-norm uncertainty sets around the samples. Then, we derive the robust counterpart of the deterministic model by means of RO strategies. To tackle the problem of multiclass classification, we design a new multiclass Twin Parametric Margin SVM (TPMSVM). We consider the cases of both linear and kernel-induced boundaries and propose two alternatives for the final decision function. Data perturbations are then included in the model, and RO techniques are applied to protect the TPMSVM against the worst possible realization of the uncertainty. All the aforementioned approaches are tested on real-world datasets, showing the advantages of explicitly considering the uncertainty over deterministic approaches. The second problem we analyze is related to waste collection. In this application, the uncertainty lies in the waste accumulation rate of the network bins. Since information on the empirical distribution of the uncertainty is available, Stochastic Optimization (SO) techniques are applied. We model the waste collection problem as a multi-stage stochastic inventory routing problem, where the decisions concern the selection of bins to be visited and the corresponding visiting sequence over a predefined time horizon.
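The robust counterpart with bounded-by-norm uncertainty sets can be illustrated on a plain linear SVM: for an l2-ball of radius eps around each sample, the worst-case margin becomes y(w·x + b) - eps·||w||, so robustness enters as a norm penalty inside the hinge loss. The sketch below is illustrative only, not the thesis's TPMSVM formulation; all names, the subgradient scheme, and the hyperparameters are assumptions.

```python
import numpy as np

def robust_svm_subgradient(X, y, eps=0.1, lam=0.01, lr=0.01, iters=500):
    """Linear SVM with worst-case hinge loss over an l2-ball of radius
    `eps` around each sample: the robust margin is y*(w.x + b) - eps*||w||.
    Minimized by plain subgradient descent on the regularized average loss."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(iters):
        margins = y * (X @ w + b) - eps * np.linalg.norm(w)
        active = margins < 1.0                      # samples violating the robust margin
        norm_w = np.linalg.norm(w) + 1e-12          # avoid division by zero at w = 0
        gw = (lam * w
              - (y[active][:, None] * X[active]).sum(axis=0) / n
              + (eps * active.sum() / n) * (w / norm_w))
        gb = -y[active].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Toy separable data: class +1 around (2, 2), class -1 around (-2, -2).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(2, 0.3, (50, 2)), rng.normal(-2, 0.3, (50, 2))])
y = np.array([1.0] * 50 + [-1.0] * 50)
w, b = robust_svm_subgradient(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
```

Setting eps = 0 recovers the standard (deterministic) hinge loss, which makes the price of robustness easy to compare on the same data.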
Given the computational complexity of the model, we solve it through a rolling horizon heuristic approach and carry out computational experiments on real-data instances. The impact of stochasticity on waste generation is examined through stochastic measures, and the performance of the rolling horizon approach is evaluated. Finally, we discuss some managerial insights
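The rolling horizon heuristic mentioned above can be sketched generically: at each period, solve the problem over a short look-ahead window, commit only the first period's decision, then roll the window forward. This is an illustrative skeleton, not the thesis's implementation; `solve_window` stands in for the per-window stochastic optimizer and is an assumption.

```python
def rolling_horizon(total_periods, window, solve_window):
    """Rolling-horizon skeleton: repeatedly optimize over a short
    look-ahead window, commit only the first-period decision, and
    advance one period at a time."""
    committed = []
    for day in range(total_periods):
        horizon = list(range(day, min(day + window, total_periods)))
        plan = solve_window(horizon)   # per-window optimizer (placeholder)
        committed.append(plan[0])      # commit only the first period's decision
    return committed

# Toy per-window "optimizer": plan a visit for every day in the window.
def solve_window(horizon):
    return [("visit-day", d) for d in horizon]

schedule = rolling_horizon(7, 3, solve_window)
```

In the inventory routing setting, `solve_window` would return, for each day of the window, the bins to visit and their visiting sequence; only the current day's routes are ever executed.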
Transient currents in HfO2 and their impact on circuit and memory applications
We investigate transient currents in HfO2 dielectrics, considering their dependence on electric field, temperature and gate stack composition. We show that transient currents remain an issue even at very low temperatures and irrespective of the HfO2/SiO2 bilayer properties. Finally, we assess their impact on the reliability of precision circuit and memory applications
Preliminary characterization of an expanding flow of siloxane vapor MDM
The early experimental results on the characterization of expanding flows of the siloxane vapor MDM (C8H24O2Si3, octamethyltrisiloxane) are presented. The measurements were performed on the Test Rig for Organic VApors (TROVA) at the CREA Laboratory of Politecnico di Milano. The TROVA test rig was built to investigate the non-ideal compressible-fluid behavior of the expanding flows typically occurring within organic Rankine cycle (ORC) turbine passages. The test rig implements a batch Rankine cycle in which a planar converging-diverging nozzle replaces the turbine and serves as the test section. It allows investigations both in fundamental non-ideal compressible-fluid dynamics and in turbomachinery. The nozzle can be operated with different working fluids and operating conditions, aiming at measuring independently the pressure, the temperature and the velocity field, and thus providing data to verify the thermo-fluid dynamic models adopted to predict the behavior of these flows. The limiting values of pressure and temperature are 50 bar and 400 °C, respectively. The early measurements are performed along the nozzle axis, where an isentropic process is expected to occur. In particular, the results reported here refer to the nozzle operated in adapted conditions using the siloxane vapor MDM as the working fluid, in thermodynamic regions where mild to medium non-ideal compressible-fluid effects are present. Both the total temperature and the total pressure are measured upstream of the test section, while static pressures are measured along the nozzle axis. Schlieren visualizations are also carried out in order to complement the pressure measurements with information about the 2D density gradient field. The Laser Doppler Velocimetry technique is planned to be used in the future for velocity measurements. The measured flow field has also been interpreted by resorting to the quasi-one-dimensional theory and to two-dimensional viscous CFD calculations. In both cases, state-of-the-art thermodynamic models were applied
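As a point of reference for the quasi-one-dimensional interpretation, the classical isentropic area-Mach and pressure relations for a perfect gas can be coded directly. This is only the ideal-gas baseline (the default gamma = 1.4 is for a diatomic gas, not MDM); capturing the non-ideal behavior of MDM requires the state-of-the-art thermodynamic models mentioned above.

```python
import math

def area_ratio(mach, gamma=1.4):
    """Quasi-1D isentropic area-Mach relation A/A* for a perfect gas:
    the nozzle area ratio that produces a given Mach number."""
    g = gamma
    t = (2.0 / (g + 1.0)) * (1.0 + 0.5 * (g - 1.0) * mach ** 2)
    return t ** ((g + 1.0) / (2.0 * (g - 1.0))) / mach

def pressure_ratio(mach, gamma=1.4):
    """Static-to-total pressure ratio p/p0 along an isentropic expansion
    of a perfect gas, comparable with pressures measured on the axis."""
    return (1.0 + 0.5 * (gamma - 1.0) * mach ** 2) ** (-gamma / (gamma - 1.0))

# Example: conditions at the throat (M = 1) and at a supersonic station.
throat = (area_ratio(1.0), pressure_ratio(1.0))
supersonic = (area_ratio(2.0), pressure_ratio(2.0))
```

Comparing measured p/p0 profiles against this perfect-gas curve is a simple way to visualize how strongly the non-ideal effects depart from the ideal baseline.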
