Revolut Publishes PRAGMA, a Foundation Model Trained on 40 Billion Banking Events

2026-04-14 07:04

Revolut released a paper on PRAGMA, a family of Transformer-based foundation models pre-trained with masked modelling on a corpus of 40B+ banking events from 25M users. The three model sizes (10M, 100M, and 1B parameters) target different latency/precision trade-offs and achieve strong results on credit scoring, fraud detection, and lifetime-value prediction, often via a simple linear head on top of the frozen embeddings. The architecture tokenizes tabular financial data and applies LLM-style pre-training, producing reusable embeddings that serve financial-crime, product cross-sell, and credit-risk use cases.
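The "linear head on frozen embeddings" pattern can be sketched as a linear probe. Everything below is illustrative: PRAGMA's embeddings are not public, so synthetic 64-dimensional vectors with an injected class signal stand in for them, and the head is a plain logistic regression trained by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for frozen foundation-model embeddings (hypothetical;
# real PRAGMA embeddings are not public): one 64-dim vector per user,
# with a weak linear signal injected for a binary label.
n, d = 1000, 64
y = rng.integers(0, 2, n).astype(float)   # e.g. fraud / not-fraud
X = rng.normal(size=(n, d))
X[:, 0] += 2.0 * y                        # separable direction in dim 0

# Linear head: logistic regression fitted by gradient descent while
# the (simulated) encoder stays frozen -- only w and b are learned.
w, b = np.zeros(d), 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X[:800] @ w + b)))  # sigmoid on train split
    grad_w = X[:800].T @ (p - y[:800]) / 800
    grad_b = (p - y[:800]).mean()
    w -= lr * grad_w
    b -= lr * grad_b

# Evaluate the probe on held-out users.
pred = (X[800:] @ w + b) > 0
acc = (pred == y[800:].astype(bool)).mean()
print(f"linear-probe accuracy: {acc:.2f}")
```

The appeal of this setup, as the paper's results suggest, is operational: one pre-training run produces embeddings, and each downstream task only needs a cheap linear fit rather than full fine-tuning.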