Is there any way to model groundwater well transmissivity?
I.e., the more flow you pull from a reservoir, the lower the HGL of that reservoir should be.
The only pattern I can seem to apply to reservoirs is a time-based one, which is only applicable in EPS. It seems to me that there should be a way to model transmissivity in steady-state runs as a function of well (or 'reservoir') discharge.
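To be clear about the behaviour I'm after: with a simple linear specific-capacity approximation (specific capacity being roughly tied to transmissivity), drawdown increases with discharge, so the well's pumping HGL drops the harder you pump. A rough Python sketch of that relationship, purely for illustration (function name, units, and numbers are placeholders, not from any particular modeling package):

    # Linear specific-capacity approximation: drawdown = Q / Sc,
    # so the well HGL falls linearly as discharge increases.
    def well_hgl(static_head_m, discharge_m3_per_day, specific_capacity_m2_per_day):
        """Pumping water level (HGL) at the well for a given discharge."""
        drawdown_m = discharge_m3_per_day / specific_capacity_m2_per_day
        return static_head_m - drawdown_m

    # Example: static water level at 120 m, specific capacity 200 m^3/day per m of drawdown
    for q in (0.0, 500.0, 1000.0, 2000.0):
        print(q, well_hgl(120.0, q, 200.0))

Is there a way to get a fixed-head element (reservoir) to respond like this to its own outflow in a steady-state run, rather than only through a time pattern?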