The internal debate:
A. Lab only (with a minor exception for legit DNS lookups) with all traffic hitting mock services on a mock Internet
B. Uber-throttled, hardened, and hamstrung Internet connection to improve the scope and accuracy of analysis
The obvious knock on B is that "you put others on the Internet at risk." Really? If the outbound ports are severely limited, connections are throttled, and duration is capped, isn't that risk wildly overstated? At that point, what is the difference between approach B and manually wget'ing links on a live system, moving those files into the lab, and repeating the cycle?
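To make the comparison concrete, here is a minimal sketch of what approach B could look like on a Linux gateway in front of the analysis subnet. Everything here is an assumption for illustration: the interface name (eth1), subnet (10.66.0.0/24), allowed ports, rate, and session length are placeholders, not a vetted policy.

```shell
# Hypothetical approach-B gateway config: outbound ports severely limited,
# bandwidth throttled, and duration capped. Interface, subnet, and rate
# values below are illustrative assumptions.

# Allow only DNS and HTTP/HTTPS out of the analysis subnet; drop the rest.
iptables -A FORWARD -s 10.66.0.0/24 -p udp --dport 53 -j ACCEPT
iptables -A FORWARD -s 10.66.0.0/24 -p tcp -m multiport --dports 80,443 -j ACCEPT
iptables -A FORWARD -s 10.66.0.0/24 -j DROP

# Throttle the whole subnet to a trickle with a token bucket filter.
tc qdisc add dev eth1 root tbf rate 64kbit burst 16kbit latency 400ms

# Cap the session: tear down connectivity after 15 minutes.
( sleep 900; iptables -F FORWARD; tc qdisc del dev eth1 root ) &
```

Under rules like these, a sample can resolve names and fetch payloads, but it cannot scan arbitrary ports, sustain meaningful attack bandwidth, or run indefinitely, which is the crux of the argument that B's risk to third parties is overstated.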