This thesis consists of four papers studying structure learning and Bayesian inference in probabilistic graphical models, for both undirected graphs and directed acyclic graphs (DAGs). Paper A presents a novel algorithm, called the Christmas tree algorithm (CTA), that incrementally constructs junction trees for decomposable graphs by adding one node at a time to the underlying graph. We prove that, with positive probability, the CTA can generate any junction tree on any given number of underlying nodes. Importantly for practical applications, we show that the transition probability of the CTA kernel has a computationally tractable expression. Applications of the CTA transition kernel are demonstrated in a sequential Monte Carlo (SMC) setting for c...