18 research outputs found

    SBL for Multiple Parameterized Dictionaries

    No full text
    This repository contains the Python code used in [1]. Included are (i) the code for the proposed sparse Bayesian learning (SBL)-based algorithm, (ii) the code for the Newtonized orthogonal matching pursuit (NOMP) algorithm [2] used for comparison, and (iii) additional files, such as the generalized optimal sub-pattern assignment (GOSPA) metric [3]. The repository contains three demo examples (example_*.py). The script example_crossing.py reproduces Figure 2 from [1].
    Repository structure:
    |- pySBL/pySBL.py           Code for the proposed algorithm
    |- pyOMP/pyOMP.py           Implementation of the NOMP method for comparison
    |- dictionary_functions.py  Implementation of the (parameterized) dictionary functions
    |- gopsa.py                 Implementation of the GOSPA metric used to evaluate the results
    |- example_radar.py         Demo example with a single radar comparing SBL and OMP
    |- example_multiradar.py    Demo example applying SBL to multiple radars
    |- example_crossing.py      Demo example of two crossing targets (Fig. 2)
    The code was tested using Python 3.13 and NumPy 2.2.3. The full specification of the Miniconda environment is found in python_env.txt.
    References
    [1] Moederl, J., Westerkam, A. M., Venus, A., and Leitinger, E., "A Block-Sparse Bayesian Learning Algorithm with Dictionary Parameter Estimation for Multi-Sensor Data Fusion," submitted to the IEEE 28th International Conference on Information Fusion, Rio de Janeiro, Brazil, Jul. 7-11, 2025.
    [2] B. Mamandipoor, D. Ramasamy, and U. Madhow, "Newtonized orthogonal matching pursuit: Frequency estimation over the continuum," IEEE Trans. Signal Process., vol. 64, no. 19, pp. 5066-5081, Oct. 2016.
    [3] A. S. Rahmathullah, A. F. Garcia-Fernandez, and L. Svensson, "Generalized optimal sub-pattern assignment metric," in 20th Int. Conf. Inf. Fusion, Xi'an, China, Jul. 10-13, 2017.
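    For readers unfamiliar with the evaluation metric, the sketch below illustrates the GOSPA distance from [3] (order p, cut-off c, alpha = 2) in Python; it is an independent illustration under these assumptions, not the gopsa.py implementation shipped with the repository, and it assumes SciPy is available for the optimal assignment.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def gospa(X, Y, c=1.0, p=2):
    """GOSPA distance [3] between point sets X (n x d) and Y (m x d), alpha = 2."""
    X, Y = np.asarray(X, dtype=float), np.asarray(Y, dtype=float)
    n, m = len(X), len(Y)
    if n == 0 or m == 0:
        # only a cardinality mismatch: every point is a missed or false target
        return (c**p / 2 * (n + m)) ** (1 / p)
    # pairwise Euclidean distances, truncated at the cut-off c
    D = np.minimum(np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1), c) ** p
    rows, cols = linear_sum_assignment(D)                 # optimal assignment (Hungarian algorithm)
    localization = D[rows, cols].sum()                    # truncated localization errors of assigned pairs
    cardinality = c**p / 2 * (n + m - 2 * len(rows))      # penalty for missed and false targets
    # for alpha = 2, assigning a pair at the truncated cost c**p equals leaving both
    # points unassigned, so the assignment on the truncated matrix minimizes GOSPA
    return (localization + cardinality) ** (1 / p)
```

    A typical call would compare estimated and true target positions, e.g. gospa(estimates, ground_truth, c=2.0, p=2), with c chosen to reflect the maximum tolerable localization error.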

    Approximate Bayesian Computation method for calibrating the Propagation Graph model using Summaries

    No full text
    This code learns the parameters of the polarimetric propagation graph model from summary statistics, namely temporal moments, using approximate Bayesian computation (ABC).
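    To make the idea concrete, here is a minimal rejection-ABC sketch; the toy simulator, the parameter names, the prior ranges, and the tolerance are hypothetical placeholders standing in for the actual propagation graph model and are not taken from the released code.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_temporal_moments(theta, n_realizations=50):
    """Toy stand-in for the propagation graph simulator: it only mimics the
    interface (parameters in -> temporal moments out) and is NOT the model."""
    g, tau = theta                                   # hypothetical gain / delay-spread parameters
    delays = rng.exponential(tau, size=n_realizations)
    powers = g * np.exp(-delays / tau)
    m0 = powers.sum()                                # zeroth temporal moment (total power)
    m1 = (powers * delays).sum() / m0                # first moment (mean delay)
    m2 = (powers * delays**2).sum() / m0 - m1**2     # second central moment (delay spread)
    return np.array([m0, m1, m2])

def abc_rejection(observed, prior_low, prior_high, n_draws=20000, eps=1.0):
    """Rejection ABC: keep parameter draws whose simulated temporal moments
    lie within a tolerance eps of the observed ones (standardized distance)."""
    accepted = []
    scale = np.abs(observed) + 1e-12
    for _ in range(n_draws):
        theta = rng.uniform(prior_low, prior_high)   # draw from a uniform prior
        s = simulate_temporal_moments(theta)
        if np.linalg.norm((s - observed) / scale) < eps:
            accepted.append(theta)
    return np.array(accepted)                        # samples from the ABC posterior

# Example: recover the toy parameters from "observed" temporal moments
observed = simulate_temporal_moments((2.0, 15.0))
posterior = abc_rejection(observed, prior_low=[0.1, 1.0], prior_high=[5.0, 50.0])
print(posterior.mean(axis=0) if len(posterior) else "no accepted samples")
```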

    Deep Learning method for calibrating the polarimetric Propagation graph model

    No full text
    This code learns the parameters of the polarimetric propagation graph model from summary statistics, namely temporal moments, using a deep neural network.
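    For comparison with the ABC approach above, a minimal sketch of the neural-network calibration idea follows; the toy simulator, the use of scikit-learn's MLPRegressor, and the chosen architecture are assumptions for illustration and do not reflect the actual network or training setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

def simulate_temporal_moments(theta, n_realizations=50):
    """Toy stand-in for the polarimetric propagation graph simulator
    (parameters in -> temporal moments out); NOT the actual model."""
    g, tau = theta
    delays = rng.exponential(tau, size=n_realizations)
    powers = g * np.exp(-delays / tau)
    m0 = powers.sum()
    m1 = (powers * delays).sum() / m0
    m2 = (powers * delays**2).sum() / m0 - m1**2
    return np.array([m0, m1, m2])

# 1) Build a training set: sample parameters from their prior ranges and
#    simulate the corresponding temporal moments.
thetas = rng.uniform([0.1, 1.0], [5.0, 50.0], size=(5000, 2))
moments = np.array([simulate_temporal_moments(t) for t in thetas])

# 2) Train a small feed-forward network to invert the mapping
#    temporal moments -> model parameters.
net = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
)
net.fit(moments, thetas)

# 3) Calibrate: feed the moments of a "measured" channel to the network.
observed = simulate_temporal_moments((2.0, 15.0))
print("estimated parameters:", net.predict(observed.reshape(1, -1))[0])
```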

    Dataset for Model-free detection of cyberattacks on voltage control in distribution grids

    No full text
    This dataset contains time series for three nodes in a simulated electrical grid containing solar (PV) installations, together with the control-signal output from a controller for these installations.

    A Dataset for Buffering Delays Due to the Interaction Between the Nagle Algorithm and the Delayed Acknowledgement Algorithm in Cyber-Physical Systems Communication

    No full text
    Here, we provide the research community with a dataset of the buffering delays that data packets experience on the TCP sending side in the realm of Cyber-Physical Systems (CPSs). We focus on the buffering that occurs at the sender side due to the adverse interaction between the Nagle algorithm and the delayed acknowledgement algorithm, both of which were originally introduced into TCP to prevent sending many small packets over the network. The dataset is collected using four real-life operating systems: Windows, Linux, FreeBSD, and QNX (a real-time operating system). In each scenario, there are three separate (virtual) machines: one end-host acts as the data source, another acts as the data sink, and a third acts as a network emulator that introduces artificial propagation delays between the source and the destination. To measure the buffering delay at the sender side, we record two time instants for each sent packet: when the packet is first generated at the application layer, and when it is actually sent onto the physical network. In each case, 10 independent experiment replications (runs) are executed. We provide the full distribution of all delay samples, represented by the cumulative distribution function (CDF). The data give an impression of the amount and scale of the delay occurring at the TCP sender side. More importantly, the data can be used to investigate to what degree these delays affect the performance of cyber-physical systems or other real-time applications that employ TCP.
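    As an illustration of how the two recorded time instants translate into the reported delay CDFs, a minimal sketch follows; the file name, column layout, and units are assumptions about the data format, not its actual specification.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical input: one row per packet with the two recorded time instants
# (application-layer generation time, physical-network send time), in seconds.
timestamps = np.loadtxt("delays_linux_run01.csv", delimiter=",", skiprows=1)
generated, sent = timestamps[:, 0], timestamps[:, 1]

# Sender-side buffering delay experienced by each packet, in milliseconds.
delay_ms = (sent - generated) * 1e3

# Empirical CDF: sort the samples and assign each the fraction of samples <= it.
samples = np.sort(delay_ms)
cdf = np.arange(1, len(samples) + 1) / len(samples)

plt.step(samples, cdf, where="post")
plt.xlabel("sender-side buffering delay [ms]")
plt.ylabel("empirical CDF")
plt.title("Buffering delay due to Nagle / delayed-ACK interaction")
plt.show()
```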