Here we use a high-resolution large eddy simulation coupled with bin microphysics to simulate a precipitating marine cumulus field and to identify the radiometric precision required to derive the liquid water path (LWP) with a radar/radiometer system. The simulations are used to examine the error characteristics of the total water path retrieved from the integral constraint of the passive microwave brightness temperature using a spatial interpolation technique. Three sources of bias are considered: 1) the misdetection of cloudy pixels as clear, 2) systematic differences in column water vapor between cloudy and clear skies, and 3) nonuniform beam-filling effects on the observables. The first two sources produce biases on the order of 5-10 g m^-2 with opposite signs that tend to cancel. The third source produces a bias that increases monotonically with the water path and approaches 50%; this nonuniform beam-filling bias is sensitive to footprint size. Random error arises from both the instrument measurement precision and the natural variability in the relationship between the water path and the observables. Retrievals using a radar/radiometer system with a measurement precision of 0.3 K would have pixel-scale random errors equal to the larger of 10 g m^-2 or 30%.
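The pixel-scale random-error result can be stated compactly as below; the symbol $\sigma_{\mathrm{LWP}}$ for the retrieval standard deviation is introduced here for illustration and is not from the original, and the 30% is taken to be relative to the retrieved water path:

\begin{equation}
\sigma_{\mathrm{LWP}} \approx \max\!\left(10\ \mathrm{g\,m^{-2}},\ 0.3\,\mathrm{LWP}\right)
\end{equation}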