
When Graphs Have Gaps: LIFAGU Finds Symmetry and Speeds Up Inference
26 Aug 2025
Experiments show that LIFAGU, which generalizes colour passing to unknown factors, achieves near‑zero error against ground truth while speeding up inference through lifting.

When Some Factors Go Missing, LIFAGU Finds the Symmetries
26 Aug 2025
The LIFAGU algorithm transfers potentials from known to unknown factors via structural symmetry, generalizing colour passing and enabling lifted inference.

LIFAGU: Lifted Probabilistic Inference in Factor Graphs with Unknown Factors
25 Aug 2025
This paper introduces LIFAGU, a generalization of colour passing to lift factor graphs with unknown factors, enabling exact probabilistic inference.
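To make the idea behind these posts concrete, below is a minimal, self-contained Python sketch of the potential-transfer step described above: factors whose potentials are unknown inherit the potential of a known factor with the same structural signature, so a colour-passing-style lifting can then group them. The class and function names (Factor, structural_signature, transfer_unknown_potentials) and the one-round signature are illustrative assumptions, not LIFAGU's actual implementation.

```python
# Sketch (not the paper's code) of transferring potentials from known to
# structurally identical unknown factors before lifting a factor graph.

from collections import defaultdict
from dataclasses import dataclass
from typing import Optional, Tuple, List, Dict


@dataclass
class Factor:
    name: str
    variables: Tuple[str, ...]               # variables the factor is attached to
    potential: Optional[List[float]] = None  # None marks an unknown factor


def structural_signature(f: Factor, var_degree: Dict[str, int]) -> Tuple:
    # A crude structural fingerprint: the factor's arity plus the sorted
    # degrees of its neighbouring variables. A full lifting algorithm would
    # refine such colours iteratively; one round is enough to show the idea.
    return (len(f.variables), tuple(sorted(var_degree[v] for v in f.variables)))


def transfer_unknown_potentials(factors: List[Factor]) -> None:
    # Count how many factors each variable participates in.
    var_degree: Dict[str, int] = defaultdict(int)
    for f in factors:
        for v in f.variables:
            var_degree[v] += 1

    # Group factors by structural signature.
    groups: Dict[Tuple, List[Factor]] = defaultdict(list)
    for f in factors:
        groups[structural_signature(f, var_degree)].append(f)

    # Within each group, copy a known potential onto the unknown factors.
    for group in groups.values():
        known = next((f for f in group if f.potential is not None), None)
        if known is None:
            continue  # no known factor to transfer from; leave the gap
        for f in group:
            if f.potential is None:
                f.potential = list(known.potential)


if __name__ == "__main__":
    # Two structurally identical pairwise factors; the second has no potential.
    fs = [
        Factor("f1", ("A", "B"), [0.9, 0.1, 0.2, 0.8]),
        Factor("f2", ("C", "D")),  # unknown factor
    ]
    transfer_unknown_potentials(fs)
    print(fs[1].potential)  # -> [0.9, 0.1, 0.2, 0.8]
```

After the transfer, the two factors carry identical potentials and can be grouped by a standard colour-passing lifting, which is what enables the speedups reported in the posts above.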

How Fast Is PyJuice? Testing Compilation Speed Across GPUs and Batch Sizes
25 Aug 2025
PyJuice benchmarks show 30-second compilation times for circuits with 1B+ parameters and faster GPU runtimes than baseline libraries across batch sizes and hardware.

What Happens When You Drop 32×32 Blocks in a PC Layer?
25 Aug 2025
Detailed walkthrough of training HMMs, sparse image models, and block-sparse PC layers with GPT-2 fine-tuning and WikiText-103 benchmarks.

How PyJuice Handles Block-Sparse Structures and Tied Parameters
25 Aug 2025
Explore PyJuice’s algorithm design—from layer partitioning and backpropagation to tied parameters and block-sparse probabilistic circuits.

Numerical Validation of UAV‑CRN Optimization: Improved Rates Under Energy and PLoS Constraints
25 Aug 2025
Simulations validate the proposed UAV‑CRN optimization algorithm, demonstrating improved rates and convergence and offering insights into propulsion power and IT constraints.

The Future of Tractable Deep Generative Models
24 Aug 2025
PyJuice sets a new standard for probabilistic circuits, delivering faster, leaner, and more reproducible benchmarks for generative AI research.

PyJuice Pushes HMMs and Image Models Beyond State-of-the-Art
24 Aug 2025
PyJuice outpaces SPFlow, Juice.jl, and others—training billion-edge probabilistic circuits with unmatched speed and memory efficiency.