Sorry - just back from a gig. Some Hollywood band called Evil Beaver. (Yeah - some band! I saw them last week at another gig. Very impressed. Not everyone's cup of coffee being merely a drummer and a bassist/singer.)
Re inverters (and amps etc), a rough rule (or ROT = Rule Of Thumb) is to divide their output (Watts) by ten to get input DC Amps - the "missing" couple of volts in the divisor covers inverter inefficiency. Hence a 2,000W inverter (or amp) will take up to or about 200A at typical 12V vehicle voltages under FULL load. That's more than most highbeams and may be of a similar Amperage to starter motors.
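That divide-by-ten ROT is easy to sanity-check - a minimal Python sketch (the factor of 10 is my framing of the rule, bundling ~12V nominal with typical inverter losses):

```python
def inverter_input_amps(output_watts):
    """ROT: AC output Watts / 10 ~= DC input Amps at ~12V.
    The 'missing' ~2V of the divisor covers inverter
    inefficiency and wiring/battery sag under load."""
    return output_watts / 10.0

print(inverter_input_amps(2000))  # 200.0 A at full 2kW load
print(inverter_input_amps(50))    # 5.0 A for a 50W PC
```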
But that's at FULL output. (And it's the continuous aka RMS output - peak or surge Watts are irrelevant!)
And if you only run a 50W or 90W PC then it's only 5A - 9A instead of 200A... PLUS the inverter overhead ie standby current which these days is usually small - maybe a few Amps.
FYI - My first inverters were 150W etc for the odd 240VAC chargers I had. And of course the all important coffee grinder.
Though those 150W inverters have now been replaced by 600W inverters, my current consumption is the same because the loads have not increased. In fact the load is probably less, since the newer 600W inverters have a lower standby (aka idling) current as well as better efficiency. (The latter despite inverters typically being most efficient at say 70-90% of full load - but since the newer units' overall efficiency is better, a 10-20% loading of the 600W inverter draws less than or similar to the 40-80% loading of the 150W inverter.)
I think you understand re heat and lifecycle/lifetime. If NOT running, the temp essentially does not matter. (Provided its "storage" temp specs are not exceeded. But whilst a device may only be rated to 85C for OPERATION, it may have a storage temp up to 110C etc.)
And essentially a device has a certain temperature rise during operation (at a specific load). Put that in an engine bay which is say 50C hotter than the cabin, then add 50C to its cabin operating temperature - hence a 32x shorter lifespan in simple theory, based on half the life per 10C increase (2^5 = 32). (It will actually run a bit cooler due to increased infrared heat losses, but hey, this is supposed to be a KIS else worst-case design.)
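That half-the-life-per-10C rule is just 2^(rise/10) - a quick sketch (the 10C halving step is the usual capacitor-lifetime ROT, not a precise Arrhenius fit):

```python
def life_shortening_factor(extra_temp_c, halving_step_c=10.0):
    """ROT: component life halves for every 10C rise in
    operating temperature, so life is shortened by a
    factor of 2**(rise / 10)."""
    return 2.0 ** (extra_temp_c / halving_step_c)

print(life_shortening_factor(50))  # 32.0 -> a 50C hotter engine bay = ~32x shorter life
print(life_shortening_factor(10))  # 2.0  -> one 10C step halves the life
```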
But if a component exceeds its max operating temperature, then it's instant death. Many components have an 85C rating though some have 105C etc (eg, some capacitors).
And engine bay temps vary greatly. Since most thermostats are rated at 85C, that means a coolant exit temp of at least 85C. If 85C, the air coming thru the radiator will be a MAX of 85C, but it might only be 60C etc depending on air speed. But then there might be cooler air from under the engine, but maybe (or probably) not in the upper engine bay where <whatever> is mounted. And then there's the MUCH hotter air coming from the hotter (than 85C) exhaust and spark plugs and maybe head. But if the sides of the engine bay are cooler due to wheel-well cooling and front venting... [Tho radiator cooled vehicles are usually designed to pull air thru the radiator - hence no other venting, and hence radiator shrouds etc. (Hence why removing shrouds or having bonnets partly open may result in engine overheating.)]
Geez I can go/ramble on. And on...!
In short, avoid the engine bay. Tho as SNO suggested, an enclosure - which is ALSO vented from outside - WITHOUT letting in water or dust & contaminants... (And IMO filters have their own hazards, but I'll skip that ramble!)
How would I design it? Let's assume a rear mount for all the reasons above.
A 22' run - let's round that up to 10m (33') to keep the metric maths easy.
Assume my (ideal) design limit of 0.5V max drop.
And let's go the full 2kW for now.
Hence 200A at 0.5V (max drop). V=IR hence R = V/I = 0.5V/200A = 2.5mR (milli-Ohms) for the 10m run, ie 0.25mR per metre = 250mR per km = 0.25R per km.
From powerstream's 'merkan Wire Gauge Table, 00G is 0.255512 R/km.
Damn - too big (a cable) for my usual design approach.
Ok, defeated for now...
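For the record, that first-attempt arithmetic in Python (numbers straight from the above; the 00G figure is powerstream's):

```python
v_drop_max = 0.5    # Volts - my 0.5V max-drop design limit
i_full = 200.0      # Amps - full 2kW via the divide-by-ten ROT
run_m = 10.0        # metres - the (rounded-up) 22' run

r_budget = v_drop_max / i_full          # total Ohms allowed for the run
r_per_km = r_budget / run_m * 1000.0    # Ohms per km, for wire-table lookup

print(round(r_budget * 1000, 2))  # 2.5 mR total budget
print(round(r_per_km, 3))         # 0.25 R/km needed - ie roughly 00G (0.2555 R/km)
```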
Let's assume 2G and be more accurate.
2G = 0.1563R per 1000 feet, ie 0.1563mR per foot. (Dear Lord, forgive me for using Imperial! But hey Man Esq, let's face it - it makes sense in this case! PS - IMHO of course!)
Hence 22 x 0.1563mR = 3.4386mR, call it 3.5mR.
200A at 3.5mR = 700mV = ~0.7V for that 22' run at full 2kW output.
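Same calc in Python for the 2G run (the 0.1563 figure is the wire table's mR-per-foot value):

```python
r_2g_mr_per_ft = 0.1563   # 2G resistance, milli-Ohms per foot (per the wire table)
run_ft = 22.0             # feet - the actual run length
i_full = 200.0            # Amps at full 2kW

r_run_mr = r_2g_mr_per_ft * run_ft     # milli-Ohms for the whole run
v_drop = i_full * r_run_mr / 1000.0    # Volts dropped across the cable

print(round(r_run_mr, 2))  # 3.44 mR (call it 3.5)
print(round(v_drop, 2))    # 0.69 V - ie ~0.7V at full load
```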
Ok, not too bad...
Let's assume a total 1V drop at full load taking into account a fuse & various connection losses BUT assuming a negligible GND resistance - ie, a short run of 2G or better (2 x 2G?) to body/chassis which gives a near zero resistance to battery -ve.
Many inverters cut out at 10.5V (usually to protect the battery, but can also be to protect the inverter), hence with 2G, full 2kW output should be achievable with a battery terminal voltage of 11.5V and higher.
I'll leave it there because the next step needs definition - eg, what is your battery terminal voltage @ 200A - and that depends if the alternator is charging (and what it outputs at the relevant RPM) or not.
But assuming it's the only load with engine off and the battery is fully charged (say 12.7V internal voltage), that means a battery internal resistance of less than [(12.7 - 11.5)V / 200A = 1.2V/200A =] 6mR, which essentially means an AGM battery - and that's only for start up. How long it lasts depends on battery capacity. (Tho AGMs are usually ~12.8V internal voltage, but that's only minutes difference.)
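And that last battery-resistance step, sketched out (the 11.5V floor is the 10.5V inverter cutout plus the assumed 1V total cable/connection drop from earlier):

```python
v_open = 12.7          # Volts - fully charged internal (open-circuit) voltage
v_terminal_min = 11.5  # Volts - 10.5V inverter cutout + assumed 1V total wiring drop
i_full = 200.0         # Amps at full 2kW

# Terminal voltage = open-circuit voltage minus the drop across the
# battery's internal resistance, so solve for the max allowable resistance.
r_int_max = (v_open - v_terminal_min) / i_full   # Ohms

print(round(r_int_max * 1000, 1))  # 6.0 mR max - AGM territory
```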
Ooops - I was going to leave it yet I started the next step's confustigation.