In vitro cryopreservation of date palm caulogenic meristems
Cryopreservation is the technology of choice not only for plant genetic resource preservation but also for virus eradication and for the efficient management of large-scale micropropagation. In this chapter, we describe three cryopreservation protocols (standard vitrification, droplet vitrification, and encapsulation vitrification) for highly proliferating date palm meristems initiated from in vitro cultures on plant growth regulator-free MS medium. Sucrose preculture and cold-hardening treatments significantly improve survival rates. Regeneration rates obtained with the standard vitrification, encapsulation-vitrification, and droplet-vitrification protocols can reach 30, 40, and 70%, respectively. Plants regenerated from non-cryopreserved and cryopreserved explants show no morphological variation, and genetic integrity is maintained without adverse effects of the cryogenic treatment. Cryopreservation of date palm in vitro cultures enables commercial tissue culture laboratories to move to large-scale propagation from cryopreserved cell lines that produce true-to-type plants after clonal field-testing trials. When the cost of cryostorage is compared with that of in-field conservation of date palm cultivars, tissue cryopreservation is the more cost-effective option. Moreover, many of the risks linked to field conservation, such as erosion due to climatic, edaphic, and phytopathological constraints, are circumvented.
Blind equalization and automatic modulation classification based on PDF fitting
In this paper, a completely blind equalizer based on probability density function (pdf) fitting is proposed. It does not require any prior information about the transmission channel or the transmitted constellation. We also investigate Automatic Modulation Classification (AMC) for Quadrature Amplitude Modulation (QAM) based on the pdf of the equalized signal. We propose three new approaches for AMC. The first employs maximum likelihood (ML) functions of the modulus of the real and imaginary parts of the equalized signal. The second is based on the lowest quadratic or Bhattacharyya distance between the estimated pdf of the real and imaginary parts of the equalizer output and the theoretical pdfs of M-QAM modulations. The third approach is based on theoretical pdf dictionary learning. The performance of the identification scheme is investigated through simulations.
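To make the second approach concrete, the following is a minimal Python sketch of square M-QAM order classification by comparing a Parzen (kernel) estimate of the pdfs of the real and imaginary parts of the equalized signal with the theoretical Gaussian-mixture pdfs, using the Bhattacharyya distance. The kernel bandwidth, noise standard deviation, grid, and function names are illustrative assumptions, not values from the paper.

```python
import numpy as np

def qam_levels(M):
    """Per-quadrature amplitude levels of square M-QAM, normalised to unit average symbol power."""
    L = int(np.sqrt(M))
    levels = np.arange(-(L - 1), L, 2, dtype=float)
    return levels / np.sqrt(2 * np.mean(levels ** 2))

def theoretical_pdf(x, levels, sigma):
    """Gaussian-mixture pdf of one quadrature component of noisy M-QAM symbols."""
    pdf = sum(np.exp(-(x - a) ** 2 / (2 * sigma ** 2)) for a in levels)
    return pdf / (len(levels) * sigma * np.sqrt(2 * np.pi))

def parzen_pdf(x, samples, h):
    """Parzen (Gaussian-kernel) estimate of a pdf, evaluated on the grid x."""
    d = x[:, None] - samples[None, :]
    return np.mean(np.exp(-d ** 2 / (2 * h ** 2)), axis=1) / (h * np.sqrt(2 * np.pi))

def classify_qam(y, candidates=(4, 16, 64), sigma=0.05, h=0.05):
    """Return the candidate order M whose theoretical pdf is closest, in Bhattacharyya
    distance, to the estimated pdfs of the real and imaginary parts of y."""
    x = np.linspace(-1.6, 1.6, 641)
    dx = x[1] - x[0]
    best_M, best_d = None, np.inf
    for M in candidates:
        p = theoretical_pdf(x, qam_levels(M), sigma)
        d = 0.0
        for comp in (np.real(y), np.imag(y)):
            q = parzen_pdf(x, comp, h)
            bc = np.sum(np.sqrt(p * q)) * dx      # Bhattacharyya coefficient
            d += -np.log(max(bc, 1e-12))          # Bhattacharyya distance
        if d < best_d:
            best_M, best_d = M, d
    return best_M
```

In use, `classify_qam(y_eq)` would be called on the complex samples at the equalizer output; the quadratic distance variant mentioned in the paper amounts to replacing the Bhattacharyya distance with the integrated squared difference between `p` and `q`.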
Blind equalization based on pdf distance criteria and performance analysis
In this report, we address M-QAM blind equalization by fitting the probability density function (pdf) of the equalizer output to that of the constellation symbols. We propose two new cost functions, based on kernel pdf approximation, which force the pdf at the equalizer output to match the known constellation pdf. The kernel bandwidth of a Parzen estimator is updated during the iterations to improve convergence speed and to decrease the residual error of the algorithms. Unlike related existing techniques, the new algorithms measure the distance between the observed and assumed pdfs of the real and imaginary parts of the equalizer output separately. The advantage of proceeding this way is that the distributions exhibit fewer modes, which facilitates equalizer convergence, while phase recovery is preserved, as with multi-modulus methods. The proposed approaches outperform CMA and classical pdf-fitting methods in terms of convergence speed and residual error. We also analyse the convergence properties of the most efficient proposed equalizer via the ordinary differential equation (ODE) method.
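As an illustration of the criterion only (the report's algorithms are adaptive, with a stochastic gradient and a shrinking kernel bandwidth), the sketch below builds a Parzen estimate of the pdfs of the real and imaginary parts of the equalizer output and minimises their quadratic distance to a target QPSK pdf with a generic batch optimiser. The channel, equalizer length, bandwidth `h`, grid, and function names are arbitrary illustration choices.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def parzen(z, samples, h):
    """Gaussian-kernel (Parzen) pdf estimate evaluated on the grid z."""
    d = z[:, None] - samples[None, :]
    return np.mean(np.exp(-d ** 2 / (2 * h ** 2)), axis=1) / (h * np.sqrt(2 * np.pi))

def pdf_cost(w_ri, X, z, p_target, h):
    """Quadratic distance between the output pdfs and the target pdf,
    computed separately for the real and imaginary parts (fewer modes each)."""
    L = len(w_ri) // 2
    w = w_ri[:L] + 1j * w_ri[L:]
    y = X @ w
    dz = z[1] - z[0]
    return sum(np.sum((parzen(z, c, h) - p_target) ** 2) * dz
               for c in (y.real, y.imag))

# --- toy QPSK experiment -------------------------------------------------
N, L = 800, 7
s = (rng.choice([-1, 1], N + L) + 1j * rng.choice([-1, 1], N + L)) / np.sqrt(2)
channel = np.array([1.0, 0.4 + 0.3j, 0.2])             # hypothetical multipath channel
r = np.convolve(s, channel)[: N + L]
X = np.array([r[n:n + L][::-1] for n in range(N)])     # received regressor matrix

z = np.linspace(-2, 2, 201)
h = 0.1
p_target = parzen(z, np.array([-1.0, 1.0]) / np.sqrt(2), h)  # target pdf of one QPSK component

w0 = np.zeros(2 * L)
w0[L // 2] = 1.0                                        # centre-spike initialisation
res = minimize(pdf_cost, w0, args=(X, z, p_target, h),
               method="Powell", options={"maxiter": 100})
w = res.x[:L] + 1j * res.x[L:]
print("residual pdf distance:", res.fun)
```

Because each quadrature component of QPSK has only two modes (at ±1/√2), the target pdf is much simpler than the pdf of the complex symbols or of their modulus, which is what eases convergence in the separate real/imaginary treatment.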
An adaptive radius blind equalization algorithm based on pdf fitting
In this paper, we investigate blind equalization techniques based on information-theoretic criteria. They involve estimating the probability density function (pdf) of the transmitted data. Our work builds on previous studies in which the Parzen window method was used to estimate the pdf at the equalizer output. The equalizer is obtained by minimizing the distance between this equalized pdf and a target distribution. With a view to reducing algorithm complexity, we propose a reduced-constellation implementation of the adaptive equalizer. We show gains in complexity and performance over similar approaches in the literature.
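The complexity saving of a reduced-constellation implementation can be seen in a generic kernel-based stochastic update (an illustrative criterion, not the exact algorithm of the paper): the per-sample work grows linearly with the number of target points, so replacing the full 16-QAM alphabet with its four quadrant centroids divides that work by four. Step size, bandwidth, and the names below are assumptions.

```python
import numpy as np

def kernel_update(w, x, targets, mu=0.01, h=0.3):
    """One stochastic-gradient step of an illustrative Gaussian-kernel fitting
    criterion J = -(1/|A|) * sum_a exp(-|y - a|^2 / (2 h^2)), with y = w^H x.
    The cost of each step is proportional to the size of the target set A."""
    y = np.vdot(w, x)                      # equalizer output w^H x
    e = y - targets                        # error to every target point
    g = np.exp(-np.abs(e) ** 2 / (2 * h ** 2))
    grad = np.sum(g * np.conj(e)) / (2 * h ** 2 * len(targets)) * x
    return w - mu * grad

# Full 16-QAM alphabet (16 target points) versus a reduced constellation:
# the four quadrant centroids (4 target points).
levels = np.array([-3.0, -1.0, 1.0, 3.0]) / np.sqrt(10)
full16 = np.array([a + 1j * b for a in levels for b in levels])
c = 2.0 / np.sqrt(10)                      # centroid of one 16-QAM quadrant
reduced4 = np.array([c + 1j * c, c - 1j * c, -c + 1j * c, -c - 1j * c])

# Each call evaluates |targets| kernels; the reduced set is 4x cheaper per sample.
rng = np.random.default_rng(1)
w = np.zeros(7, dtype=complex); w[3] = 1.0
x = (rng.standard_normal(7) + 1j * rng.standard_normal(7)) / np.sqrt(2)
w = kernel_update(w, x, reduced4)
```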
The law review paper between the Kingdom of the law and the realms of academia: A systemic functional analysis of adverbial clauses
Legal discourse has long been classified among those genres that defy generic changes the most (Gocić 2012). Recently, however, hybrid legal genres have been challenging this generic stability by imposing their own norms to coin a novel kind of ‘legal culture’ (Goźdź-Roszkowski 2011: 11). The law review article is a case in point, for it combines both legal and academic standards of writing which make it “far richer in intertextuality and interdiscursivity” (Bhatia 2006: 6) than the traditional set of legal genres. This generic subversion can be traced in the lexico-grammatical choices made by the authors to turn their papers into influential legal sources rather than mere descriptions of the law. In this context, this study aspires to scrutinize the use of adverbial clauses as one specific lexico-grammatical choice in a corpus of 44 accredited law review papers with the aim of showing how this hybrid genre strives to evolve beyond the stagnation of what is termed ‘language of the law’. Specifically, a Systemic Functional Linguistics analysis of the semantic, structural and thematic uses of these structures is conducted to demonstrate how the hybridity of contexts in a single genre can make for unprecedented generic breaches. The quantitative and qualitative analyses revealed an uneven distribution of adverbial patterns in favor of non-finite purpose and finite condition, concession and reason clauses. Additionally, the positional distribution of these patterns is manipulated whenever the need arises to hedge claims as a form of allegiance to the communal demands of the law and academia. These choices are found to comply with the authors’ needs to balance both legal and academic rituals of writing while observing at the same time their personal needs to be highly acclaimed as legal scholars and to “publish or perish” (Christensen & Oseid 2008: 1).
Towards More Effective Feedback Strategies to Enhance Microteaching for Pre-service Teachers at ISEAH Mahdia
This exploratory practice study examined the effectiveness of the feedback strategies currently in use at the Higher Institute of Applied Studies in the Humanities of Mahdia in relation to the practice of microteaching for Tunisian pre-service teachers. Qualitative and quantitative data from third-year students majoring in Education and Teaching were collected: 30 videotaped microteaching lessons, two in-class discussions, and teacher trainees’ responses to a survey designed to track their progress in light of the feedback they had received from their trainer and peers. The analyses revealed traceable improvement in the trainees’ understanding and performance, thereby establishing the efficacy of the current feedback strategies for enhancing the quality of students’ microteaching. 
EFSA Panel on Dietetic Products, Nutrition and Allergies (NDA); Scientific Opinion on the substantiation of a health claim related to polyphenols in olive and maintenance of normal blood HDL-cholesterol concentrations (ID 1639, further assessment) pursuant to Article 13(1) of Regulation (EC) No 1924/2006
Following a request from the European Commission, the Panel on Dietetic Products, Nutrition and Allergies (NDA) was asked to provide a scientific opinion on a health claim pursuant to Article 13 of Regulation (EC) No 1924/2006 in the framework of further assessment related to polyphenols in olive and maintenance of normal blood HDL-cholesterol concentrations. The food constituent, polyphenols in olive (olive fruit, olive mill waste waters or olive oil, Olea europaea L. extract and leaf) standardised by their content of hydroxytyrosol and its derivatives (e.g. oleuropein complex), that is the subject of the health claim is sufficiently characterised. The claimed effect, maintenance of normal blood HDL-cholesterol concentrations, which is eligible for further assessment, is a beneficial physiological effect. The proposed target population is the general population. No evidence from which conclusions could be drawn for the scientific substantiation of the claim, in addition to the Panel’s earlier opinion, was provided. The Panel considers that no data were submitted which would require a reconsideration of the conclusions expressed in its previous opinion, in which it concluded that the evidence provided was insufficient to establish a cause and effect relationship between the consumption of olive oil polyphenols (standardised by the content of hydroxytyrosol and its derivatives) and maintenance of normal blood HDL-cholesterol concentrations.
Blind equalization using kernel methods and automatic modulation classification techniques
In digital transmissions, multipath propagation introduces intersymbol interference (ISI) that can make it difficult to recover the transmitted data; an equalizer can therefore be used to reduce the ISI. Among equalization techniques, blind equalization approaches retrieve symbols transmitted through an unknown channel using only the received data and some knowledge of the statistics of the original sequence, thereby avoiding training sequences that are costly in useful throughput. In the last decade, new blind equalization techniques, based on information-theoretic criteria and on estimating the probability density function (pdf) of the transmitted data, have been proposed. These criteria consider the whole data distribution and are optimized adaptively, in general by means of stochastic gradient techniques. The objective of this thesis is to propose new blind equalization techniques, based on pdf fitting using kernel methods, that are more efficient than existing ones in terms of convergence speed and residual error. We have proposed new equalizers fulfilling these requirements and have shown that the performance of the most powerful of the proposed methods is close to that of the non-blind minimum mean square error (MMSE) equalizer. Furthermore, in order to tackle the new challenges related to the construction of systems that are intelligent and able to adapt to the transmission conditions, we have also studied automatic modulation classification (AMC) techniques. These techniques are useful in particular for adaptive modulation and for cognitive radio systems, where the receiver has no knowledge of either the channel or the transmitted modulation. We have proposed new approaches for the automatic classification of modulations, especially QAM and PSK modulations.
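Since the non-blind MMSE linear equalizer serves here as the performance benchmark, the sketch below computes those benchmark taps for a known FIR channel under the standard assumptions of unit-power i.i.d. symbols and white noise; the channel, equalizer length, decision delay, and noise variance are arbitrary illustration values, not taken from the thesis.

```python
import numpy as np

def mmse_equalizer(h, L, delay, sigma2):
    """Linear MMSE equalizer taps for an FIR channel h, assuming unit-power
    i.i.d. symbols and white noise of variance sigma2; y_n = w^H x_n estimates
    the symbol transmitted `delay` samples earlier."""
    # Channel convolution matrix: the received regressor is x_n = H s_n + v_n.
    H = np.zeros((L, L + len(h) - 1), dtype=complex)
    for i in range(L):
        H[i, i:i + len(h)] = h
    R = H @ H.conj().T + sigma2 * np.eye(L)   # input covariance E[x x^H]
    p = H[:, delay]                           # cross-correlation E[x d*]
    return np.linalg.solve(R, p)

# Hypothetical channel, 11-tap equalizer, decision delay 5, noise variance 0.01.
h = np.array([1.0, 0.5 + 0.3j, 0.2])
w = mmse_equalizer(h, L=11, delay=5, sigma2=0.01)
combined = np.convolve(np.conj(w), h)         # overall channel + equalizer response
print("main tap index:", np.argmax(np.abs(combined)))   # should sit near the chosen delay
```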
Epidemiological Aspects of Hepatitis A: Endemicity Patterns and Molecular Epidemiology
Improvements in hygiene and socio-economic conditions in many parts of the world have led to an epidemiological shift in hepatitis A, with a transition from high to low endemicity. Consequently, in these areas, a higher proportion of symptomatic disease among adolescents, resulting in large-scale community outbreaks, has been described. In Tunisia, an increase in the average age at the time of infection has been reported, resulting in regular outbreaks, especially household and primary school epidemics. Molecular investigation of such outbreaks, based on the determination of the viral genotype and the genetic relatedness between hepatitis A virus (HAV) strains, is a useful tool to identify the potential source of HAV contamination, but also to assess the molecular dynamics of the virus over time, such as the introduction of a new genotype or a specific clustering of HAV strains according to geographical origin. In Sfax city (center-east of Tunisia), only HAV strains of genotype IA are circulating. In rural areas, HAV infection is still highly endemic, probably with a water-borne transmission pattern. Nevertheless, the considerable genetic heterogeneity observed in urban areas highlights the changing pattern of hepatitis A epidemiology in these settings. Further molecular studies are strongly needed to better understand HAV epidemiology in Tunisia.
