bnlearn (3.0)
* added dsep() to test d-separation.
* implemented Tree-Augmented Naive Bayes (TAN) in tree.bayes().
* implemented Semi-Interleaved HITON-PC in si.hiton.pc().
* improved parsing in read.{bif,dsc,net}().
* fixed an indexing error in Gaussian permutation tests.
* added a "textCol" highlight option to graphviz.plot().
* added {incoming,outgoing,incident}.arcs().
* fixed two hard-to-hit bugs in tabu search (thanks Maxime Gasse).
* added custom.fit() for expert parameter specification (see the sketch below).
* added support for Markov blanket preseeding in learn.mb().
* assume independence instead of returning an error when a
partial correlation test fails due to errors in the
computation of the pseudoinverse of the covariance matrix.
* fixed an incorrect optimization in the backward phase of mmpc().
* fixed a floating point rounding problem in the mutual
information tests (64bit OSes only, thanks Maxime Gasse).
* fixed cp{query,dist}() handling of TRUE (thanks Maxime Gasse).
* added learn.nbr() to complement learn.mb().
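  A minimal sketch of a custom.fit() call (hypothetical two-node network,
  made-up probabilities, assuming the current bnlearn interface):
      library(bnlearn)
      # hand-specified structure and conditional probability tables
      net = model2network("[A][B|A]")
      cptA = matrix(c(0.4, 0.6), ncol = 2, dimnames = list(NULL, c("a1", "a2")))
      cptB = matrix(c(0.8, 0.2, 0.3, 0.7), ncol = 2,
                    dimnames = list(B = c("b1", "b2"), A = c("a1", "a2")))
      fitted = custom.fit(net, dist = list(A = cptA, B = cptB))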
bnlearn (2.9)
* integrate with the graph package and provide import/export
functions for the graphNEL and graphAM classes.
* removed bn.var.test() and the "aict" test.
* fixed wrong ISS handling in the BDe score.
* fixed a buffer overflow in the BGe score.
* added subgraph(), tiers2blacklist(), and cextend().
* fixed bug in boot.strength(), which failed with a spurious
error when a PDAG was learned.
* support interval discretization as the initial step in
Hartemink's discretization.
* fixed blacklist handling in chow.liu().
* fixed choose.direction() and arc.strength(); both require a
  .test.counter and now create it as needed.
* added as.bn() and as.bn.fit() for "grain" objects (from the
  gRain package) and as.grain() for "bn.fit" objects.
* fixed an infinite loop in choose.direction().
* make choose.direction(..., criterion = "bootstrap") work again.
* added an 'every' argument to random.graph() for the 'ic-dag'
and 'melancon' algorithms.
* shortened the optional arguments for discretize(...,
  method = "hartemink") to "idisc" and "ibreaks" (see the sketch below).
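  A minimal sketch of Hartemink's discretization with an interval-based
  initial step (bundled gaussian.test data, assuming the current
  discretize() interface):
      library(bnlearn)
      data(gaussian.test)
      # start from "ibreaks" interval-discretized levels, then collapse
      # them into "breaks" levels per variable
      disc = discretize(gaussian.test, method = "hartemink", breaks = 3,
                        ibreaks = 20, idisc = "interval")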
bnlearn (2.8)
* switched "cpdag" to TRUE in {boot,custom}.strength().
* added a "weights" argument to custom.strength().
* implemented the modified BDe score handling mixtures of
experimental and observational data (mbde).
* reimplemented the BGe score for speed (about 20x faster).
* fixed a buffer overflow in predict() for discrete networks.
* fixed sanitization in predict() for bn.fit.{d,g}node objects.
* handle partially directed graphs in bn.cv() for log-likelihood
loss (both discrete and Gaussian).
bnlearn (2.7)
* make .onLoad() more robust, so that it passes "R CMD check"
even with a broken Graphviz installation.
* reduced memory usage in graphviz.plot(), now using arc lists
instead of adjacency matrices.
* added tie breaking in prediction.
* allow inter.iamb() to break infinite loops instead of returning
an error.
* fixed a buffer overflow in discrete Monte Carlo tests.
* added sequential Monte Carlo permutation tests.
* improved performance and error handling in Gaussian Monte
Carlo tests.
bnlearn (2.6)
* allow discrete data in Hartemink's discretization algorithm.
* implemented {read,write}.bif() to import/export BIF files.
* implemented {read,write}.dsc() to import/export DSC files.
* implemented {read,write}.net() to import/export NET files.
* completely reimplemented compare() to return useful metrics;
  previously it was just a slower version of all.equal().
* implemented model averaging with significance thresholding
  in averaged.network() (see the sketch below).
* use new significance thresholding in {boot,custom}.strength()
and arc.strength(criterion = "bootstrap").
* export predicted values in bn.cv() when using classification
error.
* fixed an integer overflow in mutual information tests.
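  A minimal sketch of model averaging with significance thresholding
  (bundled learning.test data, hill-climbing chosen just for illustration,
  assuming the current interface):
      library(bnlearn)
      data(learning.test)
      # bootstrap the arc strengths, then keep only the significant arcs
      strengths = boot.strength(learning.test, R = 200, algorithm = "hc")
      averaged = averaged.network(strengths)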
bnlearn (2.5)
* reimplemented rbn.discrete() in C both for speed and to
get CPT indexing right this time.
* added bn.net() to complement bn.fit().
* changed the default value of the imaginary sample size to
  10, which is a de facto standard in the literature.
* implemented the ARACNE and Chow-Liu learning algorithms.
* improved robustness of correlation estimation.
* added a "cpdag" option to boot.strength().
* fixed bug in discretize().
* improved sanitization in graphviz.plot() and strength.plot().
* added Hamming distance.
bnlearn (2.4)
* reimplemented naive Bayes prediction in C for speed.
* added some debugging output to predict() methods.
* fixed printing of fitted Gaussian BNs.
* fixed stack imbalance in Gaussian Monte Carlo tests.
* implemented some discretization methods in discretize().
* added custom.strength() for arbitrary sets of networks.
* fixed strength.plot() threshold for significant arcs.
bnlearn (2.3)
* added cpdist() to generate observations from arbitrary
  conditional probability distributions (see the sketch below).
* added a simple naive.bayes() implementation for discrete
networks, complete with a predict() implementation using
maximum posterior probability.
* added the shrinkage test for the Gaussian mutual information.
* added ntests(), in.degree(), out.degree(), degree(),
whitelist() and blacklist().
* added support for the snow package in bn.boot(), bn.cv(),
cpquery() and cpdist().
* fixed integer overflow in the computation of the number of
parameters of discrete networks.
* fixed errors in the labels of Gaussian scores.
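  A minimal sketch of cpdist() (bundled learning.test data, structure and
  evidence chosen just for illustration, assuming the current interface):
      library(bnlearn)
      data(learning.test)
      fitted = bn.fit(hc(learning.test), learning.test)
      # draw observations from the distribution of A and C given E == "b"
      particles = cpdist(fitted, nodes = c("A", "C"), evidence = (E == "b"))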
bnlearn (2.2)
* fixed a bug in moral(), which returned duplicated arcs when
shielded parents were present in the graph.
* implemented all.equal() for bn objects.
* added workaround for plotting empty graphs with graphviz.plot(),
which previously generated an error in Rgraphviz.
* added a CITATION file.
* added the narcs() function.
* print.bn() now supports small(er) line widths.
* added support for "bn.fit" objects in graphviz.plot().
* added support for changing the line type of arcs in
graphviz.plot().
* added the learn.mb() function to learn the Markov blanket of
a single variable.
* fixed calls to UseMethod(), which were not always working
correctly because of the changed parameter matching.
* fixed an off-by-one error in the prediction for discrete
root nodes.
bnlearn (2.1)
* optimized data frame subsetting and parents' configurations
construction in conditional.test() and score.delta().
* fixed and improved the performance of rbn().
* fixed wrong penalization coefficient for Gaussian AIC (was
computing a Gaussian BIC instead).
* added cpquery() to perform conditional probability queries
  via Logic (Rejection) Sampling (see the sketch below).
* added bn.cv() to perform k-fold cross-validation, with
expected log-likelihood and classification error as
loss functions.
* added predict(), logLik() and AIC() methods for bn.fit
objects.
* renamed bnboot() to bn.boot() for consistency with bn.cv()
and bn.fit().
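  A minimal sketch of cpquery() (bundled learning.test data, event and
  evidence chosen just for illustration, assuming the current interface):
      library(bnlearn)
      data(learning.test)
      fitted = bn.fit(hc(learning.test), learning.test)
      # P(A == "a" | B == "b"), estimated by logic (rejection) sampling
      cpquery(fitted, event = (A == "a"), evidence = (B == "b"))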
bnlearn (2.0)
* added the shd() distance.
* renamed dag2ug() to skeleton(), which is more intuitive.
* added support for "bn.fit" objects in rbn().
* added vstructs() and moral().
* added the coronary data set.
* improved partial correlation resilience to floating point
  errors when dealing with ill-behaved covariance matrices.
* miscellaneous (small) optimizations in both R and C code.
bnlearn (1.9)
* added support for "bn.fit" objects in nodes(), nbr(),
parents(), children(), root.nodes(), leaf.nodes(),
arcs(), directed.arcs(), undirected.arcs(), amat(),
nparams(), mb(), path(), directed(), acyclic(),
node.ordering().
* fixed bug in hybrid and score-based learning algorithms,
which did not handle blacklists correctly.
bnlearn (1.8)
* removed the fast mutual information test in favour of
the equivalent shrinkage test, which uses a more
systematic approach.
* fixed fast.iamb(), which should not have truncated
exact and Monte Carlo tests.
* added the HailFinder and Insurance data sets.
* updated the Grow-Shrink implementation according to
newer (and much clearer) literature from Margaritis.
* rewritten more of the configuration() function in C,
resulting in dramatic (2x to 3x) speedups for large
data sets.
* implemented tabu search.
* removed rshc() in favour of rsmax2(), a general two-stage
restricted maximization hybrid learning algorithm.
* reimplemented cpdag() in C, with an eye towards a
  future integration with constraint-based algorithms.
* fixed a bug in coef() for discrete bn.fit objects.
* implemented Melancon's uniform probability random DAG
generation algorithm.
bnlearn (1.7)
* big clean-up of C code, with some small optimizations.
* fixed bug in the handling of upper triangular matrices
(UPTRI3 macro in C code).
* added the dag2ug() and pdag2dag() functions.
* fixed a bug in bn.fit(); it now works correctly for
  discrete data as well.
* added bn.moments(), bn.var() and bn.var.test() for
basic probabilistic modelling of network structures.
bnlearn (1.6)
* implemented the mmhc() algorithm and its generic
template rshc().
* rewritten both the optimized and the standard implementations
  of hc() in C; both are much faster than before.
* various fixes to documentation and bibtex references.
* revised the functions implementing the second half
of the constraint-based algorithm.
* improved parameter sanitization in "amat<-"().
* fixed the functions that set arcs' direction in
constraint-based algorithms.
bnlearn (1.5)
* improved parameter sanitization in the "<-"()
functions and modelstring().
* added support for bootstrap inference with bnboot(),
  boot.strength(), arc.strength(criterion = "bootstrap")
  and choose.direction(criterion = "bootstrap").
* fixed a bug in acyclic() causing false negatives.
* added bn.fit() for estimating the parameters of a Bayesian
  network conditional on its structure (see the sketch below).
* mapped some S3 methods (print, fitted, fitted.values,
  residuals, resid, coef, coefficients) to objects of
  class "bn.fit", "bn.fit.gnode" and "bn.fit.dnode".
* added some plots for the fitted models based on the
lattice package.
* implemented AIC and BIC for continuous data, and
removed the likelihood score.
* various optimizations to C code.
* thorough documentation update.
* fixed an infinite loop bug in inter.iamb().
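  A minimal sketch of bn.fit() (bundled learning.test data, structure
  learned with hc() just for illustration, assuming the current interface):
      library(bnlearn)
      data(learning.test)
      dag = hc(learning.test)
      # estimate the parameters conditional on the learned structure
      fitted = bn.fit(dag, learning.test)
      coef(fitted$A)  # conditional probability table of node A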
bnlearn (1.4)
* exported the "B" parameter to specify the number of
permutations to be done in a permutation test.
* removed the "direction" parameter from constraint-based
learning algorithms, as it was non-standard,
misnamed and had various reported problems.
* removed the duplicate "dir" label for the BDe score.
* added support for Gaussian data to rbn() and nparams().
* added "modelstring<-"().
* revised references in documentation.
* added the alarm and marks data sets.
* moved the scripts to generate data from the networks
included as data sets to the "network.scripts"
directory.
bnlearn (1.3)
* added Monte Carlo permutation tests for mutual
information (for both discrete and Gaussian data),
Pearson's X^2, linear correlation and Fisher's Z.
* added strength.plot().
* reimplemented random.graph() in C for speed.
* clean up of C memory allocation functions.
bnlearn (1.2)
* added cache.partial.structure() to selectively
update nodes' cached information stored in
'bn' objects.
* fixed a bug in cache.structure().
* reimplemented is.acyclic() in C to fix a bug
causing false negatives.
* added the lizards data set.
bnlearn (1.1)
* implemented mmpc().
* slightly changed gaussian.test to be more learning-friendly.
* fixed bugs in empty.graph() and "arcs<-"().
* changed the default probability of arc inclusion for
the "ordered" method in random.graph() to get sparser
graphs.
* added graphviz.plot().
* added the option of not learning arc directions
  in constraint-based algorithms.
* changed the default value of the strict parameter
from TRUE to FALSE.
* reimplemented cache.structure() in C to increase
random.graph() performance and scalability.
bnlearn (1.0)
* completely rewritten random.graph(); now it supports
different generation algorithms with custom tuning
parameters.
* moved to dynamic memory allocation in C routines.
* improved performance and error handling of rbn().
bnlearn (0.9)
* reimplemented all the functions that deal with cycles
  and paths in C, which increased their speed manyfold
  and greatly improved their memory use.
* cycle detection and elimination are now parallelized via the
  snow package in gs(), iamb(), fast.iamb() and inter.iamb().
* renamed {root,leaf}nodes() to {root,leaf}.nodes().
* rewritten node ordering in C to improve performance
and avoid recursion.
* added ci.test(), which provides a front-end to all the
  independence and conditional independence tests
  implemented in the package.
* added mutual information (for Gaussian data) and
Pearson's X^2 tests (for discrete data).
* removed the Mantel-Haenszel test.
bnlearn (0.8)
* added support for random restarts in hc().
* added arc.strength(), with support for both conditional
  independence tests and network scores (see the sketch below).
* added the asia data set.
* lots of documentation updates.
* reimplemented functions related to undirected arcs in C
for speed.
* added more parameter sanitization.
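  A minimal sketch of arc.strength() with both kinds of criteria (bundled
  learning.test data, assuming the current interface):
      library(bnlearn)
      data(learning.test)
      dag = hc(learning.test)
      # strength as the p-value of the test for the arc's removal ...
      arc.strength(dag, learning.test, criterion = "x2")
      # ... or as the score difference caused by the arc's removal
      arc.strength(dag, learning.test, criterion = "bic")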
bnlearn (0.7)
* optimized hc() via score caching, score equivalence,
and partial reimplementation in C.
* many utility functions' backends reimplemented in C
for speed.
* improved cycle and path detection.
* lots of documentation updates.
* added more parameter sanitization.
bnlearn (0.6)
* implemented hc().
* added support for the K2 score for discrete networks.
* ported Gaussian posterior density from the deal package.
* added the gaussian.test data set.
* added an AIC-based test for discrete data.
* lots of documentation updates.
* added more parameter sanitization.
bnlearn (0.5)
* added more utility functions, such as rootnodes(),
leafnodes(), acyclic(), empty.graph() and random.graph().
* reimplemented parents' configuration generation in C
for speed.
* lots of documentation updates.
* added lots of parameter sanitization in utils-sanitization.R.
bnlearn (0.4)
* added rbn(), with support for discrete data.
* added a score() function, with support for likelihood,
  log-likelihood, AIC, BIC, and the posterior Dirichlet
  density of discrete networks (see the sketch below).
* ported modelstring(), a string representation of a network,
from package deal.
* added many utility functions, such as parents() and children()
and their counterparts "parents<-"() and "children<-"().
* lots of documentation updates.
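  A minimal sketch of score() with two of the supported scores (bundled
  learning.test data, structure learned with hc() just for illustration,
  assuming the current interface):
      library(bnlearn)
      data(learning.test)
      dag = hc(learning.test)
      score(dag, learning.test, type = "bic")
      score(dag, learning.test, type = "bde", iss = 10)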
bnlearn (0.3)
* added support for the snow package in gs(), iamb(), inter.iamb()
and fast.iamb().
* added the learning.test data set.
* reimplemented mutual information in C for speed.
* lots of documentation updates.
bnlearn (0.2)
* implemented iamb(), inter.iamb() and fast.iamb().
* added partial correlation and Fisher's Z conditional
independence tests for Gaussian data.
* first completely documented release.
bnlearn (0.1)
* initial release.
* preliminary implementation of gs() with mutual information
as conditional independence test.