I'm talking about Big O Notation analysis of algorithms.
“Although developed as a part of pure mathematics, this notation is now frequently also used in the analysis of algorithms to describe an algorithm's usage of computational resources: the worst case or average case running time or memory usage of an algorithm is often expressed as a function of the length of its input using big O notation. This allows algorithm designers to predict the behavior of their algorithms and to determine which of multiple algorithms to use, in a way that is independent of computer architecture or clock rate. Because big O notation discards multiplicative constants on the running time, and ignores efficiency for low input sizes, it does not always reveal the fastest algorithm in practice or for practically-sized data sets, but the approach is still very effective for comparing the scalability of various algorithms as input sizes become large.” - [http://en.wikipedia.org/wiki/Big_O_notation]
We are still talking about code: specifically, your code, how well it performs when pushed to its limits, and, more specifically, helping you understand those limits and plan for what will happen when they are reached. I'm not going into a full class on Big O; I'm just highlighting an example from code I am working on at the moment that reminded me of Big O.
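To make the scalability point concrete, here's a quick sketch (hypothetical, not the actual import code) of how two approaches to the same task diverge as input grows. Matching incoming record IDs against a list is O(n) per lookup, so checking m records costs O(n·m); against a set (a hash-based structure) each lookup averages O(1), so the whole pass is O(m). At small sizes both feel instant; at large sizes only one scales.

```python
def count_known_list(incoming, known_list):
    # O(n) membership test repeated for each incoming ID -> O(n * m) total
    return sum(1 for rec_id in incoming if rec_id in known_list)

def count_known_set(incoming, known_set):
    # O(1) average membership test per ID -> O(m) total
    return sum(1 for rec_id in incoming if rec_id in known_set)

# Hypothetical IDs standing in for imported records
known = [f"ID{i}" for i in range(10_000)]
incoming = [f"ID{i}" for i in range(5_000, 15_000)]

# Both give the same answer; only the growth rate differs as sizes climb
assert count_known_list(incoming, known) == 5_000
assert count_known_set(incoming, set(known)) == 5_000
```

The point isn't this particular trick; it's that the same correct answer can hide wildly different growth curves, and Big O is the vocabulary for comparing them before the data gets big enough to hurt.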
At CF Webtools we have taken on the task of doing a RETS [http://www.rets.org/] data import. I have over five years of experience developing real estate search platforms and working with real estate data. However, I am not a RETS expert, although many of my former co-workers are, and I discussed the issues and methods of RETS with them on a regular basis. Taking on a RETS import, writing your own RETS client, and managing the data is not for the weak of code.