Hello Devrim,
For me, scaling is of two types:
A. Normalization
B. Min-max scaling
— — — — — — — — — — — — — — — — — — — — — — — — — — — — — —
A. Normalization :
In the classical statistical sense, normalization means (x − μ)/σ. When statisticians say "normalize", they mean transforming a distribution to have mean = 0 and variance = 1; the result is the standard normal distribution. Any normal random variable X can be transformed into a standard score, or z-score, via the equation z = (x − μ)/σ. If you think etymologically, it makes sense to call this process 'normalization', since 'normal' here refers to the 'standard normal'.
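To make the z-score transformation concrete, here is a minimal sketch in Python (the function name and sample data are my own, purely for illustration):

```python
import statistics

def z_scores(values):
    """Transform values to z-scores: z = (x - mu) / sd."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)  # sample standard deviation
    return [(x - mu) / sd for x in values]

data = [2.0, 4.0, 6.0, 8.0, 10.0]
print(z_scores(data))  # resulting values have mean 0 and (sample) sd 1
```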
B. Min-max scaling (informally referred to as normalization, but I would call it 'standardizing')
Coming to the other type of scaling, the formula below depicts min-max scaling:

x_scaled = (x − min(x)) / (max(x) − min(x))
The above formula scales the values to the range [0, 1] and, in a true sense, 'standardizes' them.
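As with normalization above, a minimal sketch of min-max scaling in Python (again, the helper name and sample data are mine):

```python
def min_max_scale(values):
    """Rescale values to [0, 1]: (x - min) / (max - min)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        raise ValueError("all values are identical; the range is zero")
    return [(x - lo) / (hi - lo) for x in values]

data = [2.0, 4.0, 6.0, 8.0, 10.0]
print(min_max_scale(data))  # [0.0, 0.25, 0.5, 0.75, 1.0]
```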
Think about it: how is scaling values to [0, 1] 'normal', or 'normalization'? It simply does not make sense. I would hence rather call this process 'standardizing', as the values are standardized to range between 0 and 1.
I know that, informally, many of the links referred to by the author do use the two terms interchangeably, but if you were to talk to a statistician, he/she would get confused and perhaps would not approve of these definitions.
Sometimes, things get lost or misinterpreted in translation. I think this is a classic case where a statistical concept gets wrongly defined by non-statisticians when it moves into a different field. Since this is a purely statistical concept being applied here, I also believe one needs to define things properly.
I hope I have clarified my point of view.