Model Predictive Control (MPC) is widely used in the process industry for its superior control performance, but its real-time computational demands limit its use in systems with fast dynamics. To address this, we propose an offline approach that approximates MPC control laws with oblique decision trees (DTs) carrying linear predictions, which are then deployed as online controllers. Unlike explicit MPC, which suffers from scalability issues due to the exponential growth of its partitions, DTs offer both interpretability through if-else rules and scalability via data-driven training on datasets generated from ideal MPC simulations. Notably, DTs with oblique splits and linear leaf predictions mirror the piecewise affine structure of explicit MPC. A key challenge is training the DT model, which is a mixed-integer optimization problem. We tackle it with a novel gradient-based algorithm, enabling efficient training with GPU-accelerated machine-learning tools. Through case studies, we demonstrate that this method accurately approximates both linear and nonlinear MPC control laws, significantly reducing online computation time while maintaining control performance.
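To make the structural analogy with explicit MPC concrete, the minimal sketch below shows how an oblique DT with linear leaf predictions could evaluate a control input online: each oblique split is a halfspace test on the state, and each leaf applies an affine law, so the overall map is piecewise affine. The node classes, field names, and numerical values are illustrative assumptions, not the paper's implementation or training algorithm.

```python
import numpy as np

# Hypothetical containers for illustration; the paper's data structures may differ.
class ObliqueNode:
    def __init__(self, a, b, left, right):
        self.a, self.b = a, b            # oblique split: go left if a @ x + b <= 0
        self.left, self.right = left, right

class LinearLeaf:
    def __init__(self, K, k):
        self.K, self.k = K, k            # affine control law u = K @ x + k on this region

def evaluate(node, x):
    """Route the state x through oblique splits, then apply the leaf's affine law."""
    while isinstance(node, ObliqueNode):
        node = node.left if node.a @ x + node.b <= 0.0 else node.right
    return node.K @ x + node.k

# Toy two-region tree for a 2-state, 1-input system (illustrative values only).
tree = ObliqueNode(
    a=np.array([1.0, -0.5]), b=0.1,
    left=LinearLeaf(K=np.array([[-0.8, -0.3]]), k=np.array([0.0])),
    right=LinearLeaf(K=np.array([[-1.2, -0.1]]), k=np.array([0.05])),
)
x = np.array([0.4, 0.2])
u = evaluate(tree, x)  # piecewise affine map from state to control input
```

Online evaluation of such a tree costs only a handful of dot products per query, which is the source of the reduced computation time relative to solving an optimization problem at every sampling instant.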