# Encryption/decryption using the Windows CryptoAPI

The following C project contains the C source code and C examples used for encryption/decryption using the Windows CryptoAPI. The class provides encryption and decryption through the CryptoAPI.

The following Matlab project contains the source code and Matlab examples used for Shannon and non-extensive entropy.
The functions include the extensive Shannon entropy and the non-extensive Tsallis, escort Tsallis, and Rényi entropies.

The following Matlab project contains the source code and Matlab examples used for cross entropy tsp solver.
A TSP toy problem solved by the Cross-Entropy Method (a stochastic optimization procedure) based on the generation of multiple random paths.

The following Matlab project contains the source code and Matlab examples used for entropy, joint entropy and conditional entropy function for n variables.
For entropy:
H = entropy(S)
This command evaluates the entropy of S; S should be a row vector.
H = entropy([X;Y;Z])
This command finds the joint entropy of the three variables.
H = entropy([X,Y],[Z,W])
This finds the conditional entropy H(X,Y|Z,W).
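The behaviour described above can be sketched in plain Python (the original is MATLAB; `entropy` and `conditional_entropy` below are illustrative stand-ins computed from empirical symbol frequencies, not the submission's actual code):

```python
from collections import Counter
from math import log2

def entropy(*rows):
    # Shannon entropy (in bits) of the empirical joint distribution of the
    # given discrete sequences: one sequence gives H(X), several give the
    # joint entropy H(X, Y, ...).
    symbols = list(zip(*rows))
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * log2(c / n) for c in counts.values())

def conditional_entropy(xy_rows, z_rows):
    # Chain rule: H(X,Y | Z,W) = H(X,Y,Z,W) - H(Z,W)
    return entropy(*xy_rows, *z_rows) - entropy(*z_rows)

X = [0, 0, 1, 1]
Y = [0, 1, 0, 1]
print(entropy(X))                     # 1.0: X is uniform over {0, 1}
print(entropy(X, Y))                  # 2.0: joint entropy of two fair bits
print(conditional_entropy([Y], [X]))  # 1.0: X tells us nothing about Y here
```

Each row plays the role of one of the row vectors in the MATLAB calls above; stacking rows corresponds to `entropy([X;Y;Z])`.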

The following Matlab project contains the source code and Matlab examples used for a two-qubit quantum computer.
In this program we examine the entropy of a two-qubit system that includes the phonon entropy; the decoherence rate and the entropy are both examined.

The following Matlab project contains the source code and Matlab examples used for normalized mutual entropy.
Normalized mutual entropy provides a measure of the diversity of a two-dimensional matrix.

The following Matlab project contains the source code and Matlab examples used for approximate entropy.
Approximate Entropy is a measure of complexity.

The following Matlab project contains the source code and Matlab examples used for multivariate gaussian mixture model optimization by cross entropy.
Fit a multivariate gaussian mixture by a cross-entropy method.

The following Matlab project contains the source code and Matlab examples used for calculating the sample entropy, in bits, of discrete variables.
Entropy: Returns entropy (in bits) of each column of 'X'
by Will Dwinnell
H = Entropy(X)
H = row vector of calculated entropies (in bits)
X = data to be analyzed
Note 1: Each distinct value in X is considered a unique symbol.

The following Matlab project contains the source code and Matlab examples used for conditional entropy.
ConditionalEntropy: Calculates conditional entropy (in bits) of Y, given X
H = ConditionalEntropy(Y,X)
H = calculated entropy of Y, given X (in bits)
Y = dependent variable (column vector)
X = independent variable(s)
Note 1: Each distinct value is considered a unique symbol.

The following Matlab project contains the source code and Matlab examples used for joint entropy.
JointEntropy: Returns joint entropy (in bits) of each column of 'X'
Note: Each distinct value is considered a unique symbol.
H = JointEntropy(X)
H = calculated joint entropy (in bits)
X = data to be analyzed

The following Matlab project contains the source code and Matlab examples used for thresholding the minimum cross entropy.
The method of minimum cross entropy chooses the threshold that loses the least information during thresholding.
The principle is to calculate the distance D between two distributions P and Q:
D(P,Q) = sum_i p_i log(p_i / q_i)
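The distance D above is the Kullback-Leibler divergence; a minimal Python sketch of the formula (the function name is a hypothetical illustration, not the submission's code) is:

```python
from math import log

def cross_entropy_distance(p, q):
    # D(P, Q) = sum_i p_i * log(p_i / q_i)  (Kullback-Leibler divergence).
    # Terms with p_i == 0 contribute nothing, by the usual convention.
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

P = [0.5, 0.5]
print(cross_entropy_distance(P, [0.5, 0.5]))  # 0.0: identical distributions
print(cross_entropy_distance(P, [0.9, 0.1]))  # > 0: Q has drifted away from P
```

In minimum cross-entropy thresholding, each candidate threshold induces a distribution Q, and the threshold minimizing D(P, Q) against the image histogram P is selected.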

The following Matlab project contains the source code and Matlab examples used for thresholding the maximum entropy.
Maximum entropy thresholding is based on the maximization of the information measure between object and background.

The following Matlab project contains the source code and Matlab examples used for information theory toolbox.
This toolbox contains functions for discrete random variables to compute the following quantities:
1)Entropy
2)Joint entropy
3)Conditional entropy
4)Relative entropy (KL divergence)
5)Mutual information
6)Normalized mutual information
7)Normalized variation information
This toolbox is a tweaked and bundled version of my previous submissions.

The following Matlab project contains the source code and Matlab examples used for image entropy.
The entropy of an image is an objective indicator of its quality.

The following Matlab project contains the source code and Matlab examples used for entropy calculator.
In information theory, entropy is a measure of the uncertainty associated
with a random variable.

The following Matlab project contains the source code and Matlab examples used for fast approximate entropy.
Approximate Entropy (ApEn) is a popular tool for analysing the complexity of time-series data, especially in clinical research.

The following Matlab project contains the source code and Matlab examples used for sample entropy.
SampEn is a measure of complexity that can be easily applied to any type of time series data, including physiological data such as heart rate variability and EEG data.
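A compact Python sketch of the standard SampEn definition (an illustration of the general technique, not this submission's MATLAB code): two subsequences "match" when their Chebyshev distance is at most the tolerance r, self-matches are excluded, and SampEn is the negative log of the fraction of length-m matches that remain matches at length m + 1.

```python
from math import log

def sample_entropy(x, m=2, r=0.2):
    # SampEn(m, r) = -ln(A / B), where B counts pairs of length-m templates
    # within tolerance r (Chebyshev distance, self-matches excluded) and A
    # counts the same for length m + 1.
    def count_matches(length):
        nt = len(x) - m  # same number of templates for both lengths
        templates = [x[i:i + length] for i in range(nt)]
        n = 0
        for i in range(nt):
            for j in range(i + 1, nt):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    n += 1
        return n

    b = count_matches(m)       # matches of length m
    a = count_matches(m + 1)   # matches of length m + 1
    return log(b / a) if a > 0 and b > 0 else float("inf")

# A perfectly regular alternating series is maximally predictable:
print(sample_entropy([1, 2] * 5, m=2, r=0.5))  # 0.0
```

For real heart-rate or EEG signals, r is conventionally chosen as a fraction (e.g. 0.2) of the series' standard deviation.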

The following Matlab project contains the source code and Matlab examples used for entropy estimation by kozachenko leonenko method.
The script calculates the entropy point estimate for 1D data by the Kozachenko-Leonenko method.

The following Matlab project contains the source code and Matlab examples used for entropy estimation from histogram.
The script calculates the entropy point estimate from a 1D histogram of the data.

The following Matlab project contains the source code and Matlab examples used for mutual information and joint entropy.
Compute the mutual information and joint entropy of two images. To use this function, also download joint_histogram.m from my File Exchange.

The following Matlab project contains the source code and Matlab examples used for compute the entropy of an entered text string.
To compute the entropy, the frequency of occurrence of each character must first be determined.
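The character-frequency approach can be sketched in a few lines of Python (an illustrative stand-in for the MATLAB submission; `text_entropy` is a hypothetical name):

```python
from collections import Counter
from math import log2

def text_entropy(s):
    # Shannon entropy, in bits per character, computed from the empirical
    # frequency of occurrence of each character in the string.
    counts = Counter(s)
    n = len(s)
    return sum(c / n * log2(n / c) for c in counts.values())

print(text_entropy("aaaa"))  # 0.0: a single repeated character carries no information
print(text_entropy("abab"))  # 1.0: two equiprobable characters = 1 bit per character
```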

The following Matlab project contains the source code and Matlab examples used for von mises image entropy.
The local Rényi entropy of a given image is calculated in four equally spaced orientations and used to determine the parameters of the von Mises distribution of the image entropy.

The following Matlab project contains the source code and Matlab examples used for permutation entropy.
% Calculate the permutation entropy
% Input: y: time series;
% m: order of permutation entropy
% t: delay time of permutation entropy
% Output:
% pe: permutation entropy
% hist: the histogram for the order distribution
% Ref: G Ouyang, J Li, X Liu, X Li, Dynamic Characteristics of Absence EEG Recordings with Multiscale Permutation
% Entropy Analysis, Epilepsy Research, doi: 10.
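The idea behind the inputs above can be sketched in Python (a hedged illustration of the general permutation-entropy technique, not the submission's MATLAB code): each window of m samples, taken with delay t, is reduced to its ordinal pattern, and the entropy of the pattern distribution is returned.

```python
from collections import Counter
from math import log

def permutation_entropy(y, m, t):
    # Permutation entropy of order m with delay t: the Shannon entropy
    # (in nats, unnormalized) of the distribution of ordinal patterns
    # over the embedded windows of the series.
    patterns = Counter()
    for i in range(len(y) - (m - 1) * t):
        window = y[i:i + (m - 1) * t + 1:t]
        # The ordinal pattern is the argsort of the window's values.
        patterns[tuple(sorted(range(m), key=lambda k: window[k]))] += 1
    n = sum(patterns.values())
    return sum(c / n * log(n / c) for c in patterns.values())

# A monotone series produces a single ordinal pattern, hence zero entropy:
print(permutation_entropy(list(range(10)), m=3, t=1))  # 0.0
```

Some variants normalize the result by log(m!) so that white noise approaches 1.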

The following Matlab project contains the source code and Matlab examples used for multiscale permutation entropy (mpe).
% Calculate the Multiscale Permutation Entropy (MPE)
% Input: X: time series;
% m: order of permutation entropy
% t: delay time of permutation entropy
% Scale: the scale factor
% Output:
% MPE: multiscale permutation entropy

The following Matlab project contains the source code and Matlab examples used for modified minimum cross entropy threshold selection.
% MinCEP is a function for thresholding using Minimum Cross Entropy
% threshold selection over the non-blank region of an image.

The following Matlab project contains the source code and Matlab examples used for ar filter + minimum entropy deconvolution for bearing fault diagnosis.
This function AR_MED_FILTER takes an input SIGNAL with sampling frequency Fs and applies a Yule-Walker-based AR filter.

The following Matlab project contains the source code and Matlab examples used for ecological information based approach.
This is a simple code for calculating the number of nodes, Total System Throughput (TST), Average Mutual Information (AMI), conditional entropy, effective connectivity, and the effective number of roles of an information-flow network represented in the form of a matrix.

The following Matlab project contains the source code and Matlab examples used for fuzzy entropy and mutual information.
Nowadays there are heaps of articles on the theory of fuzzy entropy and fuzzy mutual information.

The following Matlab project contains the source code and Matlab examples used for fast mutual information, joint entropy, and joint histogram calculation for n d images.
This uses no for-loops - only index manipulation.