bnlearn (5.0)

  * completed the implementation of KL(), which now supports conditional 
     Gaussian networks in addition to discrete and Gaussian ones.
  * implemented Shannon's entropy.
  * conditional independence tests now have optional arguments, like
     network scores.
  * added a "custom-test" conditional independence test allowing user-provided
     test statistics in the same way as "custom" allows user-provided network
     scores.
  * added a "params.threshold" argument to the hard EM methods in bn.fit(),
     and renamed the log-likelihood threshold to "loglik.threshold".
  * the log-likelihood stopping rule in hard EM now uses the log-likelihood of
     the completed data, which works better in the presence of latent
     variables and is more appropriate according to Koller & Friedman (thanks
     Laura Azzimonti).
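
A minimal usage sketch of the extended KL() (the gaussian.test data set ships
with bnlearn; any two fitted networks over the same variables should work):

```r
library(bnlearn)

# Fit two Gaussian networks on the same data with different structure
# learning algorithms, then compare the joint distributions they encode
# with the Kullback-Leibler divergence KL(P || Q).
data(gaussian.test)
P = bn.fit(hc(gaussian.test), gaussian.test)
Q = bn.fit(tabu(gaussian.test), gaussian.test)
KL(P, Q)
```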

bnlearn (4.9.1)

  * assorted fixes to the Rprintf() format strings to pass the CRAN tests.
  * the default node shape in graphviz.plot(), strength.plot() and
     graphviz.compare() is now "rectangle", which is more space-efficient for
     typical node labels.
  * graphviz.compare() now accepts bn.fit objects, converting them to the
     corresponding bn objects to compare the respective network structures.
  * fixed a segfault in ci.test(), triggered by setting the conditioning 
     variable set to a zero-column matrix (thanks Qingyuan Zheng).

bnlearn (4.9)

  * as.prediction() is now deprecated and will be removed by the end of 2024.
  * graphviz.plot(), strength.plot() and graphviz.compare() now have a
     "fontsize" argument that controls the font size of the node labels.
  * removed the rbn() method for bn objects.
  * predict() and impute() can now use exact inference with method = "exact" for
     discrete and Gaussian networks.
  * the "custom" score now accepts incomplete data.
  * it is now possible to use the "custom" score to implement custom Bayesian
     scores in BF() and bf.strength().
  * fixed the conditional probabilities computed by cpquery(), which now 
     disregards particles for which either the evidence or the event
     expressions evaluate to NA (thanks Simon Rauch).
  * added a complete.graph() function to complement empty.graph().
  * removed the "empty" method of random.graph(), use empty.graph() instead.
  * structural.em() can now use exact inference in the expectation step with 
      impute = "exact".
  * structural.em() can now be called from bn.boot(), boot.strength() and 
      bn.cv().
  * updated as.bn.fit() to work with the latest gRain release.
  * fixed segfault in tree.bayes() with illegal whitelists and blacklists.
  * predict(method = "bayes-lw") and predict(method = "exact") now work even
     when from = character(0).
  * implemented hard EM in bn.fit() with method = "hard-em" (discrete BNs),
     method = "hard-em-g" (Gaussian BNs) and method = "hard-em-cg"
     (conditional Gaussian BNs).
  * predict() and impute() can use clusters from the parallel package with all 
     available methods.
  * hard EM methods in bn.fit() can use clusters from the parallel package
     like previously available parameter estimators.
  * boot.strength() now shuffles the columns of the data by default, which seems
     to broadly improve structural accuracy.
  * constraint-based algorithms are now guaranteed to return a CPDAG; this was
     not the case previously because shielded colliders were preserved along
     with unshielded ones (thanks Ruben Camilo Wisskott).
  * bn.fit() now works with network classifiers (thanks Riley Mulhern).
  * logLik() has been re-implemented and now accepts incomplete data.
  * tabu search now works with continuous data containing latent variables
     (thanks David Purves).
  * read.net() can now parse interval nodes (thanks Marco Valtorta).
  * graphviz.chart() now handles text.col correctly even when it contains a 
     separate colour for each node.
  * impute() now produces an error (with "strict" set to TRUE, the default) or
     a warning (with "strict" set to FALSE) instead of returning data that
     still contain missing values.
  * better sanitization of CPTs in custom.fit() (thanks Dave Costello).
  * implemented the node-average (penalized) likelihood scores from Bodewes
     and Scutari for discrete ("nal" and "pnal"), Gaussian ("nal-g" and
     "pnal-g") and conditional Gaussian ("nal-cg" and "pnal-cg") BNs.
  * predict() for bn.fit objects now accepts incomplete data, conditioning on
     the observed values and averaging over the missing values in each
     observation in the case of method = "bayes-lw" and method = "exact".
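
A sketch of the new hard EM estimators and exact prediction (the DAG below is
the true structure of the learning.test data set that ships with bnlearn;
argument names follow bn.fit() and the predict() method for bn.fit objects):

```r
library(bnlearn)

# A discrete data set with some missing values.
data(learning.test)
incomplete = learning.test
incomplete[1:50, "A"] = NA

dag = model2network("[A][C][F][B|A][D|A:C][E|B:F]")

# Hard EM parameter learning for discrete networks.
fitted = bn.fit(dag, incomplete, method = "hard-em")

# Prediction using exact inference instead of Monte Carlo sampling.
pred = predict(fitted, node = "E", data = learning.test, method = "exact")
```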

bnlearn (4.8.1)

  * assorted fixes to the C code to pass the CRAN tests.

bnlearn (4.8)

  * the rbn() method for bn objects is now deprecated and will be removed by the
     end of 2023.
  * removed choose.direction().
  * implemented gbn2mvnorm(), which converts a Gaussian BN to its multivariate
     normal global distribution, and mvnorm2gbn(), which does the opposite.
  * added the extended BIC from Foygel and Drton.
  * the maximum likelihood estimators in bn.fit() now have distinct labels "mle"
     (discrete BNs), "mle-g" (Gaussian BNs) and "mle-cg" (conditional Gaussian
     BNs) to identify them without ambiguity.
  * graphviz.chart() now supports Gaussian and conditional Gaussian BNs
     (thanks Tom Waddell).
  * the "draw.levels" argument of graphviz.chart() has been renamed to
     "draw.labels".
  * chow.liu(), aracne() and tree.bayes() now handle data with missing values.
  * implemented the hierarchical Dirichlet parameter estimator for related
     data sets from Azzimonti, Corani and Zaffalon.
  * all unidentifiable parameters are now NAs, including those that were NaNs
     before, for consistency across node types.
  * structural.em() now returns descriptive error messages when the data
     contain latent variables (thanks Bernard Liew).
  * implemented the Kullback-Leibler divergence for discrete and Gaussian
     networks.
  * fixed spurious errors in bn.boot() resulting from changes in the
     all.equal() method for functions in R 4.1.0 (thanks Fred Gruber).
  * added a mean() method that averages bn.fit objects with the same network
     structure, with optional weights.
  * bn.boot(), bn.cv() and boot.strength() now handle data with missing values.
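
A sketch of the new conversion between a Gaussian BN and its multivariate
normal global distribution; the element and argument names (mu, sigma) used
here are assumptions, so check the gbn2mvnorm() manual page:

```r
library(bnlearn)

# Convert a fitted Gaussian network to its global multivariate normal
# distribution and back; the round trip should preserve the distribution.
data(gaussian.test)
fitted = bn.fit(hc(gaussian.test), gaussian.test)
mvn = gbn2mvnorm(fitted)    # assumed: a list with the mean vector and
                            # the covariance matrix of the joint distribution
refit = mvnorm2gbn(bn.net(fitted), mu = mvn$mu, sigma = mvn$sigma)
```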

bnlearn (4.7)

  * removed the "moral" argument from vstructs() and cpdag().
  * removed the path() alias to path.exists().
  * the "nodes" argument has been removed from averaged.network(); it was not
     meaningfully used anywhere.
  * the choose.direction() function is now deprecated, and it will also be
     removed by the end of 2022.
  * faster sanitization of bn objects (thanks David Quesada).
  * fixed an overflow in the BGe score that produced NaN values (thanks David
     Quesada).
  * Date and POSIXct objects are not valid inputs for functions in bnlearn
     (thanks David Purves).
  * export the function computing the significance threshold in
     averaged.network() as inclusion.threshold() (thanks Noriaki Sato).
  * fixed the sanitization of custom cutpoints in strength.plot().
  * reimplemented discretize() in C for speed, Hartemink's discretization is
     faster by a factor of at least 2x.
  * discretize() now handles data with missing values.
  * merged an implementation of the factorized NML and the quotient NML scores
     from Tomi Silander.
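
A usage sketch of the faster Hartemink discretization (the "ibreaks" and
"idisc" arguments control the initial discretization the method starts from):

```r
library(bnlearn)

# Hartemink's information-preserving discretization, reimplemented in C in
# this release; it now also accepts data with missing values.
data(gaussian.test)
ddata = discretize(gaussian.test, method = "hartemink",
                   breaks = 3, ibreaks = 60, idisc = "quantile")
```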

bnlearn (4.6.1)

  * Fixed out-of-bounds memory access in discretize() (thanks Brian Ripley).

bnlearn (4.6)

  * removed support for parametric bootstrap in bn.boot().
  * path() has been renamed path.exists(); path() will be kept as an alias
     until 2021 when it will be removed to avoid clashing with BiocGenerics.
  * the "moral" arguments of vstructs() and cpdag() are now deprecated and
     will be removed in 2021.
  * fixed graphviz.chart(), which called plot.new() unnecessarily and created
     empty figures when a graphical device such as pdf() was already open.
  * added a "custom" (decomposable) score that takes a user-specified R
     function to compute the score of local distributions in score() and
     structure learning algorithms (thanks Laura Azzimonti).
  * fixed spouses(), which now always returns a character vector.
  * added an "including.evidence" argument to the as.bn.fit() method for grain
     objects to carry over hard evidence in the conversion (thanks Rafal
     Urbaniak).
  * bn.fit() with missing data is now faster by 2x-3x.
  * due to API changes, bnlearn now suggests gRain >= 1.3-3.
  * fixed permutation tests, which incorrectly used a strict inequality when
     computing the fraction of test statistics larger than that computed from
     the original data (thanks David Purves).
  * make cpdag() and vstructs() agree for both moral = FALSE and moral = TRUE
     (thanks Bingling Wang).
  * implemented colliders(), shielded.colliders() and unshielded.colliders();
     vstructs() is now an alias of unshielded.colliders().
  * added functions to import and export igraph objects.
  * fixed pc.stable(), which failed on two-variables data sets.
  * added utility functions set2blacklist(), add.node(), remove.node(), 
     rename.nodes().
  * fixed h2pc(), which failed when encountering isolated nodes (thanks Kunal
     Dang). 
  * better argument sanitization for threshold and cutpoints in strength.plot().
  * fixed "newdata" argument sanitization for the pred-loglik-* scores.
  * read.net() now disregards experience tables instead of generating an
     error when importing NET files from Hugin (thanks Jirka Vomlel).
  * fixed bug in mmpc(), which returned wrong maximum/minimum p-values
     (thanks Jireh Huang).
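
A sketch of the "custom" score added in this release; the "fun"/"args"
argument names and the callback signature below are taken from the score()
documentation as I understand it, so treat them as assumptions:

```r
library(bnlearn)

# A user-specified decomposable score: a BIC-like local score computed from
# the linear regression of each node on its parents (illustrative only).
my.score = function(node, parents, data, args) {

  rhs = if (length(parents) == 0) "1" else paste(parents, collapse = "+")
  model = lm(as.formula(paste(node, "~", rhs)), data = data)
  # log-likelihood minus a BIC penalty for coefficients, intercept and sigma.
  as.numeric(logLik(model)) - log(nrow(data)) / 2 * (length(parents) + 2)

}#MY.SCORE

data(gaussian.test)
score(hc(gaussian.test), gaussian.test, type = "custom", fun = my.score)
```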

bnlearn (4.5)

  * the "parametric" option for the "sim" argument of bn.boot() is now
     deprecated; the argument will be removed in 2020.
  * removed the relevant() function, and the "strict" and "optimized" arguments
     of constraint-based structure learning algorithms.
  * save arc strengths as weights in the graph object returned by
     strength.plot() (thanks Fabio Gori).
  * information about illegal arcs is now preserved in averaged.network(), so
     that cpdag() works correctly on the returned network.
  * loss function "pred" (classification error, predicted values from parents)
     is now distinct from "pred-exact" (classification error, exact posterior
     predicted values for classifiers); and it is now possible to use "pred" and
     "pred-lw" in bn.cv() when the model is a BN classifier (thanks Kostas 
     Oikonomou).
  * graphviz.compare() now returns a list containing the graph objects
     corresponding to the networks provided as arguments (thanks William Raynor).
  * the "from-first" method in graphviz.compare() now has a "show.first"
     argument that controls whether the reference network is plotted at all
     (thanks William Raynor).
  * implemented the IAMB-FDR, HPC and H2PC structure learning algorithms.
  * reimplemented the BGe score using the updated unbiased estimator from
     Kuipers, Moffa and Heckerman (2014).
  * fixed the test counter in constraint-based algorithms, which would overcount
     in some cases.
  * it is now possible to use any structure learning algorithm in bn.boot() and
     bn.cv().
  * fixed prediction from parents for conditional Gaussian nodes with no
     continuous parents (thanks Harsha Kokel).
  * it is now possible to use data with missing values in learn.mb(),
     learn.nbr() and in all constraint-based structure learning algorithms.
  * fixed tabu() in the presence of zero-variance continuous variables; the
     search was not correctly initialized because the starting model is
     singular (thanks Luise Gootjes-Dreesbach).
  * implemented predictive log-likelihood scores for discrete, Gaussian and
     conditional Gaussian networks.
  * fixed an integer overflow in the nparams() method for bn.fit objects
     (thanks Yujian Liu).
  * make conditional sampling faster for large conditional probability tables
      (thanks Yujian Liu).
  * preserve structure learning information in bn.cv(), so that
     custom.strength() can get directions right from the resulting set of 
     networks (thanks Xiang Liu).
  * revised the preprocessing of whitelists and blacklists, and clarified the
     documentation (thanks Michail Tsagris).
  * added a "for.parents" argument to coef() and sigma() to make them return
     the parameters associated with a specific configuration of the discrete
     parents of a node in a bn.fit object (thanks Harsha Kokel).
  * fixed segfault in predict(..., method = "bayes-lw") from data that contain
     extra variables that are not in the network (thanks Oliver Perkins).
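
A sketch of the new "for.parents" argument (the DAG below is the true
structure of the learning.test data set; "a" is one of the levels of node A):

```r
library(bnlearn)

# Extract the parameters of a node for a single configuration of its
# discrete parents, instead of the whole conditional probability table.
data(learning.test)
dag = model2network("[A][C][F][B|A][D|A:C][E|B:F]")
fitted = bn.fit(dag, learning.test)
coef(fitted[["B"]], for.parents = list(A = "a"))
```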

bnlearn (4.4)

  * fixed pc.stable() v-structure detection in the presence of blacklisted arcs.
  * warn about trying to cextend() networks that contain no information about
     arc directions (and thus v-structures), such as those learned with
     "undirected = TRUE" or those returned by skeleton().
  * fixed a bug because of which a number of functions incorrectly reported
     that data had variables with no observed values when that was not true.
  * fixed posterior imputation from a single observed variable (thanks Derek
     Powell).
  * added an argument "max.sx" to limit the maximum allowed size of the
     conditioning sets in the conditional independence tests used in
     constraint-based algorithms and in learn.{mb,nbr}().
  * do not generate an error when it is impossible to compute a partial
     correlation because the covariance matrix cannot be (pseudo)inverted;
     generate a warning and return a zero partial correlation instead.
  * added an as.lm() function to convert Gaussian networks and nodes to (lists
     of) lm objects (thanks William Arnost).
  * fixed the penalty terms of BIC and AIC, which did not count residual
     standard errors when tallying the parameters of Gaussian and conditional
     Gaussian nodes.
  * fixed cpdag(), which failed to set the directions of some compelled arcs
     when both end-nodes have parents (thanks Topi Talvitie).
  * custom.strength() now accepts bn.fit objects in addition to bn objects
     and arc sets.
  * fixed vstructs(), which mistakenly handled moral = TRUE as if it were
     moral = FALSE (thanks Christian Schuhegger).
  * graphviz.plot() and strength.plot() now have a "render" argument that
     controls whether a figure is produced (a graph object is always returned
     from both functions).
  * graphviz.plot(), strength.plot() and graphviz.compare() now have a "groups"
     argument that specifies subsets of nodes that should be plotted close to
     each other, layout permitting.
  * fixed tree.bayes() for data frames with 2-3 variables, and chow.liu() as
     well (thanks Kostas Oikonomou).
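
A sketch of the new as.lm() conversion; whether the data must be passed
alongside the fitted network is an assumption here, so check its manual page:

```r
library(bnlearn)

# Convert a fitted Gaussian network into a list of lm objects, one per node,
# each regressing the node on its parents.
data(gaussian.test)
fitted = bn.fit(hc(gaussian.test), gaussian.test)
models = as.lm(fitted, gaussian.test)
summary(models[["F"]])
```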

bnlearn (4.3)

  * the "strict" and "optimized" arguments of constraint-based algorithms are
     now deprecated and will be removed at the beginning of 2019.
  * the relevant() function is now deprecated, and it will also be removed
     at the beginning of 2019.
  * improved and fixed a few bugs in the functions that import and export
     bn and bn.fit objects to the graph package.
  * fixed a bug in averaged.network(), which could result in inconsistent bn
     objects when arcs were dropped to obtain an acyclic graph (thanks Shuonan
     Chen).
  * added a graphviz.chart() function to produce DAG-with-barchart-nodes plots.
  * fixed the counting of the number of parameters of continuous and hybrid
     networks, which did not take the residual standard errors into account
     (thanks Jeffrey Hart).
  * improved handling of singular models in impute().
  * added an import function for pcAlgo objects from pcalg.
  * fixed bug in the sanitization of conditional Gaussian networks (thanks
     Kostas Oikonomou).
  * added a loss() function to extract the estimated loss values from the
     objects returned by bn.cv() (thanks Dejan Neskovic).
  * it is now possible to use data with missing values in bn.fit() and
     nparams().
  * added a "replace.unidentifiable" argument to bn.fit(..., method = "mle"),
     to replace parameter estimates that are NA/NaN with zeroes (for
     regression coefficients) and uniform probabilities (in conditional
     probability tables).
  * added a bf.strength() function to compute arc strengths using Bayes
     factors.
  * learn.{mb,nbr}() now work even if all nodes are blacklisted.
  * assigning singular models from lm() to nodes in a bn.fit object will now
     zapsmall() near-zero coefficients, standard errors and residuals to match
     the estimates produced by bn.fit().
  * bn.cv() now supports performing multiple runs with custom folds (different
     for each run).
  * improved sanitization in mutilated(), and updated its documentation.
  * removed the bibTeX file with references, available at www.bnlearn.com.
  * implemented the stable version of the PC algorithm.
  * added a count.graph() function that implements a number of graph enumeration
     results useful for studying graphical priors.
  * fixed loss estimation in bn.cv() for non-extendable partially directed
     graphs, now errors are produced instead of returning meaningless results
     (thanks Derek Powell).
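
A usage sketch of the new bf.strength() function:

```r
library(bnlearn)

# Arc strengths computed from Bayes factors: each arc is scored by comparing
# the network including it against the network without it.
data(learning.test)
dag = hc(learning.test)
strengths = bf.strength(dag, learning.test)
head(strengths)
```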

bnlearn (4.2)

  * added a tuning parameter for the inclusion probability to the marginal
     uniform graph prior.
  * added a Bayesian Dirichlet score using Jeffrey's prior (from Joe Suzuki).
  * allow fractional imaginary sample sizes for posterior scores.
  * allow imaginary sample sizes in (0, 1] for discrete posterior scores,
     to explore asymptotic results.
  * set the default imaginary sample size for discrete networks to 1, following
     recommendations from the literature.
  * moral(), cpdag(), skeleton() and vstructs() now accept bn.fit objects in
     addition to bn objects.
  * fixed a segfault in cpdist(..., method = "lw") caused by all weights
     being equal to NaN (thanks David Chen).
  * changed the default value of the "optimized" argument to "FALSE" in
     constraint-based algorithms.
  * changed the arguments of mmhc() and rsmax2() to improve their flexibility
     and to allow separate "optimized" values for the restrict and maximize
     phases.
  * fixed sanitization of fitted networks containing ordinal discrete
     variables (thanks David Chen).
  * improved argument sanitization in custom.fit() and model string functions.
  * added a BF() function to compute Bayes factors.
  * added a graphviz.compare() function to visually compare network structures.
  * implemented the locally averaged Bayesian Dirichlet score.
  * custom.strength() now accepts bn.kcv and bn.kcv.list objects and computes arc
     strengths from the networks learned by bn.cv() in the context of
     cross-validation.
  * fixed multiple bugs in cextend() and cpdag() that could result in the
     creation of additional v-structures.
  * implemented the Structural EM algorithm in structural.em().
  * fixed multiple bugs triggered by missing values in predict() (thanks
     Oussama Bouldjedri).
  * implemented an as.prediction() function that exports objects of class
     bn.strength to the ROCR package (contributed by Robert Ness).
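
A usage sketch of the new BF() function (the two DAGs below are arbitrary
structures over the variables of the learning.test data set):

```r
library(bnlearn)

# The Bayes factor of one candidate structure against another, for the same
# data; on the log scale, positive values favour the first network.
data(learning.test)
dag1 = model2network("[A][C][F][B|A][D|A:C][E|B:F]")
dag2 = model2network("[A][B][C][F][D|A:C][E|B:F]")
BF(dag1, dag2, learning.test, log = TRUE)
```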

bnlearn (4.1)

  * fixed memory corruption in dsep() (thanks Dominik Muller).
  * added the marginal uniform prior.
  * fixed the optimized score cache for the Castelo & Siebes and for the
     marginal uniform priors, which were affected by several subtle bugs.
  * bn.cv() now implements a "custom-folds" method that allows the user to
     manually specify which observations belong to each fold, and folds are not
     constrained to have the same size.
  * fixed checks in the C code involving R objects' classes; they failed
     when additional, optional classes were present (thanks Claudia Vitolo).
  * fixed cpdag() handling of illegal arcs that are part of shielded
     colliders (thanks Vladimir Manewitsch).
  * removed misleading warning about conflicting v-structures from cpdag().
  * rsmax2() and mmhc() now return whitelists and blacklists as they are
     at the beginning of the restrict phase (thanks Vladimir Manewitsch).
  * bn.fit() can now fit local distributions in parallel, and has been mostly
     reimplemented in C for speed (thanks Claudia Vitolo).
  * added an impute() function to impute missing values from a bn.fit object.
  * fixed loss functions for data in which observations have to be dropped
     for various nodes (thanks Manuel Gomez Olmedo).
  * added an all.equal() method to compare bn.fit objects.
  * added a "by.node" argument to score() for decomposable scores (thanks
     Behjati Shahab).
  * added warning about partially directed graphs in choose.direction() and
     improved its debugging output (thanks Wei Kong).
  * added spouses(), ancestors() and descendants().
  * fixed a segfault in predict(..., method = "lw") with discrete BNs and
     sparse CPTs that included NaNs.
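
A usage sketch of the new impute() function:

```r
library(bnlearn)

# Impute missing values in a data set from a fitted network, here using
# likelihood weighting to condition on the observed values in each row.
data(learning.test)
dag = model2network("[A][C][F][B|A][D|A:C][E|B:F]")
fitted = bn.fit(dag, learning.test)

incomplete = learning.test
incomplete[1:20, "A"] = NA
completed = impute(fitted, incomplete, method = "bayes-lw")
```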

bnlearn (4.0)

  * fixed memory usage in aracne(), chow.liu() and tree.bayes() (thanks
     Sunkyung Kim).
  * rework memory management using calloc() and free() to avoid memory
     leaks arising from R_alloc() and missing memory barriers.
  * fixed a coefficients indexing bug in rbn() for conditional Gaussian
     nodes (thanks Vladimir Manewitsch).
  * added a mean() function to average bn.strength objects.
  * fixed S4 method creation on package load on MacOS X (thanks Dietmar
     Janetzko).
  * fixed more corner cases in the Castelo & Siebes prior, and increased
     numeric tolerance for prior probabilities.
  * allow non-uniform priors for the "mbde" score (thanks Robert Ness)
     and for "bdes".
  * the "mode" attribute in bn.strength objects is now named "method".
  * added posterior probabilities to the predictions for all discrete
     networks (thanks ShangKun Deng).
  * added Steck's optimal ISS estimator for the BDe(u) score.
  * fixed the assignment of standard deviation in fitted CLG networks
     (thanks Rahul Swaminathan).
  * handle zero lambdas in the shrinkage Gaussian mutual information
     (thanks Piet Jones).
  * fixed segfault when computing posterior predictions from networks with
     NaNs in their conditional probability tables (thanks Giulio Caravagna).
  * fixed the assignment of LASSO models from the penalized package to
     fitted Gaussian networks (thanks Anthony Gualandri).
  * cpdag() now preserves the directions of arcs between continuous and
     discrete nodes in conditional linear Gaussian networks, and optionally
     also takes whitelists and blacklist into account (for any network).
  * several checks are now in place to prevent the inclusion of illegal
     arcs in conditional Gaussian networks.
  * renamed the "ignore.cycles" argument to "check.cycles" in arcs<-() and
     amat<-() for consistency with other functions such as set.arc().
  * added an "undirected" argument to mmpc() and si.hiton.pc(), which can now
     learn the CPDAG of the network instead of just the skeleton.
  * added a "directed" argument to acyclic().
  * removed unsupported argument "start" from learn.nbr().
  * handle interventions correctly in boot.strength() when using the mixed
     BDe score (thanks Petros Boutselis).
  * "bdes" is now named "bds" (it is not score equivalent, so the "e" did
    not belong).

bnlearn (3.9)

  * fixed alpha threshold truncation bug in conditional independence tests
     (thanks Janko Tackmann).
  * massive cleanup of the C code handling conditional independence tests.
  * fixed variance scaling bug for the mi-cg test (thanks Nicholas Mitsakakis).
  * in the exact t-test for correlation and in Fisher's Z, assume independence
     instead of returning an error when degrees of freedom are < 1.
  * fixed segfault in cpdist(..., method = "lw") when the evidence has
     probability zero.
  * added loss functions based on MAP predictions in bn.cv().
  * removed bn.moments() and bn.var(), they were basically unmaintained and had
     numerical stability problems.
  * added support for hold-out cross-validation in bn.cv().
  * added plot() methods for comparing the results of different bn.cv() calls.
  * permutation tests now return a p-value of 1 when one of the two
     variables being tested is constant (thanks Maxime Gasse).
  * improved handling of zero prior probabilities for arcs in the Castelo &
     Siebes prior, so that hc() and tabu() do not get stuck (thanks Jim Metz).
  * added an "effective" argument to compute the effective degrees of freedom
     of the network, estimated with the number of non-zero free parameters.
  * fixed optional argument handling in rsmax2().
  * fixed more corner cases related to singular models in
     cpdist(..., method = "lw") and predict(..., method = "bayes-lw").
  * fixed Pearson's X^2 test, in which zero cells may have been dropped too
     often in sparse contingency tables.
  * fixed floating point rounding issues in the shrinkage estimator for the
     Gaussian mutual information.

bnlearn (3.8.1)

  * fixed CPT import in read.net().
  * fixed penfit objects import from penalized (thanks John Noble).
  * fixed memory allocation corner case in BDe.

bnlearn (3.8)

  * reorder CPT dimensions as needed in custom.fit() (thanks Zheng Zhu).
  * fixed two uninitialized-memory bugs found by valgrind, one in
     predict() and one in random.graph().
  * fixed wrong check for cluster objects (thanks Vladimir Manewitsch).
  * fixed the description of the alternative hypothesis for the
     Jonckheere-Terpstra test.
  * allow undirected cycles in whitelists for structure learning algorithms
     and let the algorithm learn arc directions (thanks Vladimir Manewitsch).
  * include sanitized whitelists (as opposed to those provided by the user)
     in bn.fit objects.
  * removed predict() methods for single-node objects, use the method for
     bn.fit objects instead.
  * various fixes in the monolithic C test functions.
  * fixed indexing bug in compare() (thanks Vladimir Manewitsch).
  * fixed false positives in cycle detection when adding edges to a graph
     (thanks Vladimir Manewitsch).
  * fixed prior handling in predict() for naive Bayes and TAN classifiers
     (thanks Vinay Bhat).
  * added configs() to construct configurations of discrete variables.
  * added sigma() to extract standard errors from bn.fit objects.

bnlearn (3.7.1)

  * small changes to make CRAN checks happy.

bnlearn (3.7)

  * fixed the default setting for the number of particles in cpquery()
     (thanks Nishanth Upadhyaya).
  * reimplemented common test patterns in monolithic C functions to speed
     up constraint-based algorithms.
  * added support for conditional linear Gaussian (CLG) networks.
  * fixed several recursion bugs in choose.direction().
  * make read.{bif,dsc,net}() consistent with the `$<-` method for bn.fit
     objects (thanks Felix Rios).
  * support empty networks in read.{bif,dsc,net}().
  * fixed bug in hc(), triggered when using both random restarts and the
     maxp argument (thanks Irene Kaplow).
  * correctly initialize the Castelo & Siebes prior (thanks Irene Kaplow).
  * change the prior distribution for the training variable in classifiers
     from the uniform prior to the fitted distribution in the
     bn.fit.{naive,tan} object, for consistency with gRain and e1071 (thanks
     Bojan Mihaljevic).
  * note AIC and BIC scaling in the documentation (thanks Thomas Lefevre).
  * note limitations of {white,black}lists in tree.bayes() (thanks Bojan
     Mihaljevic).
  * better input sanitization in custom.fit() and bn.fit<-().
  * fixed .Call stack imbalance in random restarts (thanks James Jensen).
  * note limitations of predict()ing from bn objects (thanks Florian Sieck).

bnlearn (3.6)

  * support rectangular nodes in {graphviz,strength}.plot().
  * fixed bug in hc(), random restarts occasionally introduced cycles in
     the graph (thanks Boris Freydin).
  * handle ordinal networks in as.grain(), treat variables as categorical
     (thanks Yannis Haralambous).
  * discretize() returns unordered factors for backward compatibility.
  * added write.dot() to export network structures as DOT files.
  * added mutual information and X^2 tests with adjusted degrees of freedom.
  * default vstructs() and cpdag() to moral = FALSE (thanks Jean-Baptiste
     Denis).
  * implemented posterior predictions in predict() using likelihood weighting.
  * prevent silent reuse of AIC penalization coefficient when computing BIC
     and vice versa (thanks MarĂ­a Luisa Matey).
  * added a "bn.cpdist" class and a "method" attribute to the random data
     generated by cpdist().
  * attach the weights to the return value of cpdist(..., method = "lw").
  * changed the default number of simulations in cp{query, dist}().
  * support interval and multiple-valued evidence for likelihood weighting
     in cp{query,dist}().
  * implemented dedup() to pre-process continuous data.
  * fixed a scalability bug in blacklist sanitization (thanks Dong Yeon Cho).
  * fixed permutation test support in relevant().
  * reimplemented the conditional.test() backend completely in C for
     speed, it is now called indep.test().
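
A sketch of likelihood weighting with interval evidence in cpdist(); passing
the evidence as a list of ranges and reading the weights off a "weights"
attribute reflect my reading of the documentation, so treat both as
assumptions:

```r
library(bnlearn)

# Interval evidence for continuous variables under likelihood weighting:
# evidence is a named list, with a (lower, upper) range per variable.
data(gaussian.test)
fitted = bn.fit(hc(gaussian.test), gaussian.test)
particles = cpdist(fitted, nodes = "F",
                   evidence = list(A = c(0, 1), B = c(1, 2)),
                   method = "lw")
head(attr(particles, "weights"))   # the weights attached to the particles
```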

bnlearn (3.5)

  * fixed (again) function name collisions with the graph packages
     (thanks Carsten Krueger).
  * fixed some variable indexing issues in likelihood weighting.
  * removed bootstrap support from arc.strength(), use boot.strength()
     instead.
  * added set.edge() and drop.edge() to work with undirected arcs.
  * boot.strength() now has a parallelized implementation.
  * added support for non-uniform graph priors (Bayesian variable
     selection, Castelo & Siebes).
  * added a threshold for the maximum number of parents in hc() and tabu().
  * changed the default value of "moral" from FALSE to TRUE in cpdag()
     and vstructs() to ensure sensible results in model averaging.
  * added more sanity checks in cp{query,dist}() expression parsing
     (thanks Ofer Mendelevitch).
  * added 'nodes' and 'by.sample' arguments to logLik() for bn.fit objects.
  * support {naive,tree}.bayes() in bn.cv() (thanks Xin Zhou).
  * fixed predict() for ordinal networks (thanks Vitalie Spinu).
  * fixed zero variance handling in unconditional Jonckheere-Terpstra
     tests due to empty rows/columns (thanks Vitalie Spinu).
  * in bn.cv(), the default loss for classifiers is now classification
     error.
  * added a nodes<-() function to re-label nodes in bn and bn.fit objects
     (based on a proof of concept by Vitalie Spinu).
  * replaced all calls to LENGTH() with length() in C code (thanks Brian
     Ripley).
  * default to an improper flat prior in predict() for classifiers for
     consistency (thanks Xin Zhou).
  * suggest the parallel package instead of snow (which still works fine).

bnlearn (3.4)

  * move the test counter into bnlearn's namespace.
  * include Tsamardinos' optimizations in mmpc(..., optimized = FALSE),
     but not backtracking, to make it comparable with other learning
     algorithms.
  * check whether the residuals and the fitted values are present
     before trying to plot a bn.fit{,.gnode} object.
  * fixed two integer overflows in factors' levels and degrees of
     freedom in large networks.
  * added {compelled,reversible}.arcs().
  * added the MSE and predictive correlation loss functions to bn.cv().
  * use the unbiased estimate of residual variance to compute the
     standard error in bn.fit(..., method = "mle") (thanks
     Jean-Baptiste Denis).
  * revised optimizations in constraint-based algorithms, removing
     most false positives by sacrificing speed.
  * fixed warning in cp{dist,query}().
  * added support for ordered factors.
  * implemented the Jonckheere-Terpstra test to support ordered
     factors in constraint-based structure learning.
  * added a plot() method for bn.strength objects containing
     bootstrapped confidence estimates; it prints their ECDF and
     the estimated significance threshold.
  * fixed dimension reduction in cpdist().
  * reimplemented Gaussian rbn() in C, it's now twice as fast.
  * improve precision and robustness of (partial) correlations.
  * remove the old network scripts for networks that are now available
     from www.bnlearn.com/bnrepository.
  * implemented likelihood weighting in cp{dist,query}().

bnlearn (3.3)

  * fixed cpdag() and cextend(), which returned an error about
     the input graph being cyclic when it included the CPDAG of
     a shielded collider (thanks Jean-Baptiste Denis).
  * do not generate observations from redundant variables (those
     not in the upper closure of event and evidence) in cpdist()
     and cpquery().
  * added Pena's relevant() nodes identification.
  * make custom.fit() robust against floating point errors
     (thanks Jean-Baptiste Denis).
  * check that v-structures do not introduce directed cycles in the
     graph when applying them (thanks Jean-Baptiste Denis).
  * fixed a buffer overflow in cextend() (thanks Jean-Baptiste
     Denis).
  * added a "strict" argument to cextend().
  * removed Depends on the graph package, which is in Suggests
     once more.
  * prefer the parallel package to snow, if it is available.
  * replace NaNs in bn.fit objects with uniform conditional
     probabilities when calling as.grain(), with a warning
     instead of an error.
  * remove reserved characters from levels in write.{dsc,bif,net}().
  * fix the Gaussian mutual information test (thanks Alex Lenkoski).

bnlearn (3.2)

  * fixed outstanding typo affecting the sequential Monte Carlo
     implementation of Pearson's X^2 (thanks Maxime Gasse).
  * switch from Margaritis' set of rules to the more standard
     Meek/Spirtes set of rules, which are implemented in cpdag().
     Now the networks returned by constraint-based algorithms are
     guaranteed to be CPDAGs, which was not necessarily the case
     until now.
  * semiparametric tests now default to 100 permutations, not 5000.
  * make a local copy of rcont2() to make bnlearn compatible with
     both older and newer R versions.

bnlearn (3.1)

  * fixed all.equal(), it did not work as expected on networks
     that were identical save for the order of nodes or arcs.
  * added a "moral" argument to cpdag() and vstructs() to make
     those functions follow the different definitions of v-structure.
  * added support for graphs with 1 and 2 nodes.
  * fixed cpquery() handling of TRUE (this time for real).
  * handle more corner cases in dsep().
  * added a BIC method for bn and bn.fit objects.
  * added the semiparametric tests from Tsamardinos & Borboudakis
     (thanks Maxime Gasse).
  * added posterior probabilities to the predictions for
     {naive,tree}.bayes() models.
  * fixed buffer overflow in rbn() for discrete data.

Older Entries