Probably the most commonly used way to estimate a trend in something is a mathematical process called linear regression. Basically, it means fitting a straight line [for those who must be pedantic, a flat hyperplane if we have multiple predictor variables]. In the case of a time series, we use time as the predictor variable and look for a linear relationship. If we find one, we declare “Trend!” and might even posit how big it is.
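As a minimal sketch of what that looks like in practice — synthetic data, purely illustrative numbers, not any real temperature record — here is time-as-predictor regression in a few lines:

```python
import numpy as np

# Synthetic "anomaly" series: an assumed trend of 0.02 units/year plus noise.
# These numbers are made up for illustration only.
rng = np.random.default_rng(42)
years = np.arange(1970, 2020)
true_slope = 0.02
anomaly = true_slope * (years - years[0]) + rng.normal(0.0, 0.1, size=years.size)

# Linear regression with time as the predictor:
# np.polyfit with degree 1 returns the least-squares [slope, intercept].
slope, intercept = np.polyfit(years, anomaly, 1)
print(f"estimated trend: {slope:.4f} units per year")
```

The fitted slope is the “how big is it” number; whether it deserves to be called a trend at all is a separate statistical question.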
Why linear? Does anybody really believe that global average temperature since, say, 1970 has followed a straight line? Couldn’t it have wiggled around a little, just a little maybe — not noise, mind you, but genuine signal, real climate change rather than random fluctuation? Might it actually have accelerated, or even decelerated, or — heavens forbid! — taken a “hiatus”? Hell, mightn’t there have been brief episodes of all three, just not strong enough to be detected statistically (for a stickler like me)?
Of course. To my mind, the idea that, as far as global temperature goes, the climate — the signal, not the noise — followed a perfect straight line is ludicrous.