Transition Network Analysis

If anything fascinates me most, it is the dynamics of time — how and why a learning event or process unfolds, progresses, regresses, or gives rise to something altogether new. This fascination led to the development of Transition Network Analysis (TNA), a framework that I introduced with colleagues to capture the temporal unfolding of learning processes with statistical rigor. TNA combines the probabilistic foundations of Markov models with the rich analytical vocabulary of network science, enabling researchers to model, visualize, and statistically test sequential transitions between states.
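The core idea — estimating first-order Markov transition probabilities from coded state sequences and treating the result as a weighted directed network — can be sketched as follows. This is an illustrative toy in Python, not the authors' actual implementation; the function and state names are hypothetical.

```python
# Illustrative sketch: row-normalized Markov transition probabilities
# from coded state sequences. Each cell P[a][b] is the probability of
# moving from state a to state b; the matrix doubles as the adjacency
# matrix of a weighted directed network. All names are hypothetical.
from collections import defaultdict

def transition_matrix(sequences):
    """Estimate P(b | a) from observed consecutive state pairs."""
    counts = defaultdict(lambda: defaultdict(int))
    states = set()
    for seq in sequences:
        states.update(seq)
        for a, b in zip(seq, seq[1:]):  # consecutive pairs = transitions
            counts[a][b] += 1
    states = sorted(states)
    matrix = {}
    for a in states:
        total = sum(counts[a].values())
        # Rows with no outgoing transitions stay all-zero.
        matrix[a] = {b: (counts[a][b] / total if total else 0.0)
                     for b in states}
    return matrix

# Toy data: three learners' coded activity sequences (hypothetical codes).
seqs = [
    ["plan", "discuss", "write", "discuss"],
    ["plan", "write", "write", "review"],
    ["discuss", "write", "review"],
]
P = transition_matrix(seqs)
```

Each nonzero entry of `P` becomes a weighted edge, so network measures such as centrality can then identify pivotal states.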

TNA offers a comprehensive suite of tools spanning model-level diagnostics, edge-level significance testing through bootstrapping and permutation, centrality-based identification of pivotal states, and formal comparison of process models across groups or conditions. More recently, Frequency-Based TNA (FTNA) extends this framework to settings where descriptive summaries of observed transition counts matter more than probabilistic assumptions. Our work also explores the complex dynamics of human–human, human–AI, and AI–AI interactions, uncovering behavioral patterns that inform human-centric approaches to AI-enhanced learning.
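The flavor of bootstrap-based edge testing mentioned above can be illustrated with a minimal sketch: resample the sequences with replacement, recount each transition in every resample, and retain only edges that persist above a cutoff in a large share of resamples. This is a simplified stand-in under assumed names, not the actual TNA procedure, whose details differ.

```python
# Hypothetical sketch of bootstrap-style edge pruning: edges whose
# transition count stays at or above `cutoff` in at least `keep_if` of
# the resamples are treated as stable. Names and thresholds are
# illustrative only.
import random
from collections import Counter

def edge_counts(sequences):
    """Raw counts of each observed transition (a FTNA-style summary)."""
    return Counter((a, b) for seq in sequences for a, b in zip(seq, seq[1:]))

def stable_edges(sequences, n_boot=1000, cutoff=1, keep_if=0.5, seed=7):
    rng = random.Random(seed)  # fixed seed for reproducibility
    support = Counter()
    for _ in range(n_boot):
        # Resample whole sequences with replacement.
        sample = [rng.choice(sequences) for _ in sequences]
        for edge, count in edge_counts(sample).items():
            if count >= cutoff:
                support[edge] += 1
    return {edge for edge, k in support.items() if k / n_boot >= keep_if}

# Toy data: three learners' coded activity sequences (hypothetical codes).
seqs = [
    ["plan", "discuss", "write"],
    ["plan", "write", "review"],
    ["discuss", "write", "review"],
]
kept = stable_edges(seqs)
```

Resampling whole sequences (rather than individual transitions) preserves within-sequence dependence, which is why sequence-level bootstrapping is the natural unit here.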

Selected Publications
