When points are added to the regression plot, the matrix values used in the Normal Equation (e.g., XᵀX) do not match what the actual input data shown on the axes (values such as 9.2, 23.9, 34.2) would produce. Instead, the script appears to operate on different internal representations. My suspicion is that the coordinates are taken from pixel positions or otherwise scaled screen-space values, not the logical data coordinates shown to the user.
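To illustrate the suspected mismatch, here is a minimal sketch (in Python/NumPy, not the plot script's actual code) of how a pixel-to-data conversion would restore agreement between XᵀX and the on-axis values. The axis ranges, canvas size, and the `pixel_to_data` helper are all assumptions for illustration; only the x-values 9.2, 23.9, 34.2 come from the report, and the targets are synthetic.

```python
import numpy as np

# Hypothetical axis ranges and canvas size (assumptions, not taken from the plot).
X_MIN, X_MAX = 0.0, 40.0      # logical x-axis range
Y_MIN, Y_MAX = 0.0, 40.0      # logical y-axis range
WIDTH, HEIGHT = 800, 600      # canvas size in pixels

def pixel_to_data(px, py):
    """Convert screen-space pixels back to logical data coordinates.
    Note the y-axis flip: screen y grows downward."""
    x = X_MIN + (px / WIDTH) * (X_MAX - X_MIN)
    y = Y_MAX - (py / HEIGHT) * (Y_MAX - Y_MIN)
    return x, y

def normal_equation(xs, ys):
    """Solve theta = (X^T X)^{-1} X^T y with a bias column."""
    X = np.column_stack([np.ones(len(xs)), xs])
    y = np.asarray(ys, dtype=float)
    return np.linalg.solve(X.T @ X, X.T @ y)

# Logical data points as shown on the axes (x-values from the report).
data_x = np.array([9.2, 23.9, 34.2])
data_y = 2.0 * data_x + 1.0   # synthetic targets: y = 2x + 1

# If the script feeds raw pixel positions into the solver instead,
# X^T X is built from scaled values and no longer matches the axes.
pixels = [((x - X_MIN) / (X_MAX - X_MIN) * WIDTH,
           (Y_MAX - y) / (Y_MAX - Y_MIN) * HEIGHT)
          for x, y in zip(data_x, data_y)]

# Converting back first recovers the logical coordinates the user sees.
recovered = np.array([pixel_to_data(px, py) for px, py in pixels])
theta = normal_equation(recovered[:, 0], recovered[:, 1])
```

If this guess is right, `theta` comes out near [1, 2] only when the solver receives the converted coordinates; feeding `pixels` directly yields a fit in screen units, which would explain the unexpected entries in XᵀX.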