Summary
NLP and Graph Representation Learning
I work on language modelling and structured reasoning problems where graph representations help make context, relationships, and evaluation more explicit.
Why it matters
Many language tasks depend on structure across entities, documents, or contexts; graph-based methods make those relationships explicit and easier to evaluate.
Evidence
Current postdoctoral work in Cardiff focuses on NLP and graph representation learning, with GPU-accelerated training and evaluation in Python, PyTorch, and CUDA.
What I worked on
I build reproducible Python and PyTorch workflows for NLP experiments that combine graph representation learning, GPU-accelerated training, and clear evaluation logic.
This is the part of my work that maps most directly to research-engineer and applied-scientist roles. It combines current NLP work with graph-aware representations and a practical emphasis on experiment design, comparative baselines, and measurable iteration.
In practice, I take open-ended modelling questions, turn them into tractable workflows, and produce evidence that other researchers or technical teams can build on.
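To give a concrete flavour of the graph-representation idea described above, here is a minimal sketch of a single message-passing step over a toy graph, in plain Python. The function name, the toy features, and the edges are all illustrative assumptions for this page, not code from the project itself; real experiments would use PyTorch modules and learned weights rather than a simple neighbour mean.

```python
# Minimal, illustrative sketch: one round of neighbour-mean message passing.
# Names and the toy graph are hypothetical, not taken from the actual project.

def message_passing_step(features, edges):
    """Update each node's vector to the mean of itself and its neighbours."""
    neighbours = {n: [n] for n in features}   # self-loop so a node keeps its own signal
    for u, v in edges:                        # treat edges as undirected
        neighbours[u].append(v)
        neighbours[v].append(u)
    dim = len(next(iter(features.values())))
    return {
        node: [sum(features[m][d] for m in nbrs) / len(nbrs) for d in range(dim)]
        for node, nbrs in neighbours.items()
    }

# Toy graph: three entities with 2-d features and two relations.
feats = {"a": [1.0, 0.0], "b": [0.0, 1.0], "c": [1.0, 1.0]}
edges = [("a", "b"), ("b", "c")]
print(message_passing_step(feats, edges))
```

Stacking such steps is what lets a model propagate information along entity or document relationships, which is the structural signal the evaluation workflows above are designed to measure.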
Methods and tools
This work combines modern ML tooling, structured representations, and reproducible experimentation.