We did not prove Theorem 2.1 (Supporting Hyperplane Theorem). It is an intuitively obvious theorem, but actually fairly tricky to prove. The proof can be found in Boyd and Vandenberghe, Section 2.5. One first proves the "separating hyperplane theorem", which says that two non-intersecting convex sets can always be separated by a hyperplane. The supporting hyperplane theorem follows by applying the separating hyperplane theorem to two specific sets. One is the set of points $\{(c,y): y>\phi(c),\ c\in \mathbb{R}^m\}$, and the other is the set consisting of the single point $(b,\phi(b))$. The first of these sets is convex if $\phi$ is convex.

I have revised the proof of Theorem 2.4 to make it shorter, but I believe it is still clear.

I did talk about Section 2.5 today. I will return to the subject of **shadow prices** another time.

**Water-filling solution.** The problem I referred to as having a "water-filling solution" is motivated by a problem in information theory. Suppose one is trying to apportion a constrained amount of total transmission power across a number of noisy Gaussian communication channels. In channel $i$ the received signal is $Y_i=x_i+Z_i$, where $x_i$ is the input signal, with average power $\mathbb{E}[x_i^2]=p_i$, and $Z_i\sim N(0,n_i)$. When the power-to-noise ratio in the $i$th channel is $p_i/n_i$, the capacity of this channel is proportional to $\log_2(1+p_i/n_i)$. The problem of maximizing $\sum_i\log_2(1+p_i/n_i)$ subject to $\sum_i p_i=P$ is that of distributing a fixed total power $P$ across multiple channels so as to maximize the total communication capacity of these channels, i.e. the rate at which information can be reliably transmitted.
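As a small illustration of the water-filling idea, here is a sketch (not from the notes; the function name and bisection tolerance are my own choices) of the optimal allocation $p_i=\max(0,\mu-n_i)$, where the "water level" $\mu$ is found by bisection so that the powers sum to $P$:

```python
import numpy as np

def water_filling(n, P, tol=1e-10):
    """Water-filling power allocation (illustrative sketch).

    Maximizes sum_i log2(1 + p_i/n_i) subject to sum_i p_i = P, p_i >= 0.
    The optimal allocation has the form p_i = max(0, mu - n_i); the water
    level mu is located by bisection so the total power equals P.
    """
    n = np.asarray(n, dtype=float)
    lo, hi = n.min(), n.max() + P  # mu must lie in this interval
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - n, 0.0).sum() > P:
            hi = mu  # too much water: lower the level
        else:
            lo = mu  # not enough: raise the level
    return np.maximum(0.5 * (lo + hi) - n, 0.0)

# Example: noise levels 1, 2, 4 and total power P = 3.
p = water_filling([1.0, 2.0, 4.0], 3.0)
print(p)  # approximately [2, 1, 0]: the noisiest channel gets no power
```

The name comes from the picture: think of the $n_i$ as the floor heights of a vessel into which an amount $P$ of water is poured; the water settles at a common level $\mu$, and channel $i$ receives the depth of water above its floor.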