"Curve fitting" is not a term I'm familiar with in statistics.

From what the article is trying to say, it sounds like some kind of spline or nonparametric approach, but at the same time it says we know our parameters.

> You gather a set of data, you visualize it, create a fit and build a model around that fit so you can interpolate. Majority of the time, if not every time, you know exactly what parameters are in the dataset as they correspond to some physical event. Building fits help you extract a mathematical equation that will dictate how the event will act in the future given the parameters are the same. Since you know the parameters (and in the event you know how the event was setup), you can tailor your errors and uncertainties more carefully.

No, in statistics you don't know the parameters; you estimate them from the data via splines, kernel density estimation, random forests, etc. Hence it's either semiparametric or nonparametric.
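To make the distinction concrete, here's a minimal sketch of a nonparametric fit using scipy's `UnivariateSpline` (the sine-plus-noise data is just a made-up toy example): the curve's shape is estimated entirely from the data, with no fixed functional form assumed up front.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Toy data: noisy samples from an unknown relationship.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)

# Nonparametric fit: the smoothing spline's shape comes from the data;
# we never commit to a functional form with known parameters.
spline = UnivariateSpline(x, y, s=1.0)
y_hat = spline(x)
```

Contrast this with a parametric fit, where you would write down the equation first and only estimate its coefficients.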

So this paragraph contradicts itself.

> Regression is a far more loaded term and has a lot of connections to machine learning. Admittedly, curve fitting also sounds simpler. It's not. Regression analysis is most commonly used in forecasting and building predictions. It deals with the relationship between the independent variable and the dependent variables and how the dependent variables change when the independent variable is changed.

You could just say that regression makes it easier to interpret the relationship between predictors and response, compared to, say, a random forest or a neural network.

Somehow your comment comes across as very innocent and cute :) Just don't worry about it, man; it wasn't directed at you, it's directed at physicists, and to us it makes sense.

We do indeed approach most problems as parametric regression, and usually there are some bounds on the parameter values from outside the data. Sometimes this is called fitting a curve.
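The kind of bounded parametric fit described above can be sketched with `scipy.optimize.curve_fit`; the exponential-decay model and the specific bounds here are purely illustrative assumptions, not from the original comment.

```python
import numpy as np
from scipy.optimize import curve_fit

# Parametric model: the functional form is assumed up front
# (exponential decay, as a stand-in for some physical process),
# and only its parameters are estimated from the data.
def model(t, amplitude, rate):
    return amplitude * np.exp(-rate * t)

# Toy data generated from known parameters plus noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 5, 40)
y = model(t, 2.0, 0.8) + rng.normal(scale=0.05, size=t.size)

# Bounds encode knowledge from outside the data: both parameters
# must be positive and within physically plausible ranges.
popt, pcov = curve_fit(model, t, y, p0=[1.0, 1.0],
                       bounds=([0.0, 0.0], [10.0, 5.0]))
amplitude_hat, rate_hat = popt
```

This is "fitting a curve" in the physicists' sense: a fixed equation, parameters constrained by prior knowledge, estimated from data.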