High-speed data streams, ever more prevalent in our daily lives, are almost never stationary. In Web marketing applications, click-through data changes with the hour of the day and changes drastically on weekends. The same phenomenon occurs in domains as diverse as traffic control, power grids, and stock trading. Yet even the simplest change detection problem, detecting a change in the mean of the distribution, is multifaceted: the number of false positives, the number of samples needed, the accuracy with which the change point is identified, and the computational resources required each have a cost and can all be traded against one another. We present a new mean change detection algorithm suitable for high-speed data streams. The algorithm uses probabilistic bounds on the value to which a test statistic would converge in the long term to focus only on those points in the prefix of the stream at which a change might have occurred. We show that this selection limits the expected computational overhead per new sample to a constant, matching that of the fastest known algorithms. At the same time, we show that the detection accuracy, the detection delay, and the false-positive rate of the new algorithm are all far better than those of its predecessors.
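To make the setting concrete, the following is a minimal sketch of mean change detection over a stream, using Hoeffding deviation bounds as the probabilistic test. It is an illustration of the general technique, not the algorithm of this paper: the class name, the parameter `delta`, and the exhaustive scan over candidates are assumptions made here for clarity, whereas the abstract's algorithm additionally prunes candidate change points via such bounds so that the expected per-sample cost is constant.

```python
import math

class MeanChangeDetector:
    """Illustrative mean-change detector for a stream of values in [0, 1].

    Every past index is a candidate change point. A change is reported at
    candidate k when the gap between the mean before k and the mean after k
    exceeds the sum of the two Hoeffding deviation bounds. This simplified
    sketch keeps all candidates and therefore costs O(n) per sample; the
    algorithm described in the abstract prunes candidates using probabilistic
    bounds to bring the expected per-sample cost down to a constant.
    """

    def __init__(self, delta=1e-3):
        self.delta = delta        # confidence parameter (false-positive control)
        self.n = 0                # number of samples seen so far
        self.total = 0.0          # running sum of all samples
        self.candidates = []      # list of (index k, prefix sum up to k)

    def _bound(self, m):
        # Hoeffding bound on the deviation of a mean of m samples in [0, 1]
        return math.sqrt(math.log(2.0 / self.delta) / (2.0 * m))

    def update(self, x):
        """Consume one sample; return a detected change index or None."""
        self.candidates.append((self.n, self.total))
        self.n += 1
        self.total += x
        for k, s in self.candidates:
            if k == 0:
                continue          # no samples before index 0
            pre_mean = s / k
            post_mean = (self.total - s) / (self.n - k)
            eps = self._bound(k) + self._bound(self.n - k)
            if abs(post_mean - pre_mean) > eps:
                return k          # change located at index k
        return None
```

On a stream whose mean jumps from 0.2 to 0.8 at index 200, this detector stays silent before the jump and reports index 200 a short delay after it, illustrating the trade-off the abstract names between detection delay, localization accuracy, and false positives.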