The myth of boiling oceans due to downwelling longwave infrared demon radiation from hell will not die! It is pretty funny. There are still "experiments" (mainly thought experiments, since anyone actually thinking would run an experiment before commenting) being mentioned on the internet to "prove" that DWLR heats water.
Dr. Strangelove (his chosen alias) recommended one that is pretty reasonable and easy to do at home should you become seriously bored. Slightly modified, the experiment goes like this: take a glass of water filled to an inch below the rim and set an iron on top of it. Most folks have water, real glass glasses, and access to a dusty electric clothes iron.
Now Dr. Strangelove defined two hypotheses: 1) the water would evaporate and 2) the water would boil. I recommend a third: 3) you will be bored to death watching not much of anything happen.
So let's look at a little thermo. Each gram of water has a specific heat capacity of about 4.2 Joules/(gram*K). If you have a large liter glass or bowl, you can fill it with one liter (1000 grams) of pure water and have about a half inch of room at the top. Assuming room and water temperature are both about 25 C, you would need to add about 4.2 J/(g*K) times 1000 grams times 75 K to raise the water to its boiling point, a total of 315,000 Joules or 315 kJ. That is all the energy required, provided you have 100% ideal heat transfer, just to get the water to the boiling point.
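If you want to check the arithmetic without a calculator, a quick back-of-the-envelope script (Python here, using the values from the paragraph above) reproduces the 315 kJ:

```python
# Sensible heat to raise 1 liter of water from 25 C to its boiling point,
# assuming 100% ideal heat transfer as in the text.
SPECIFIC_HEAT = 4.2             # J/(g*K), liquid water
MASS_G = 1000.0                 # grams in one liter
T_START, T_BOIL = 25.0, 100.0   # degrees C

sensible_j = SPECIFIC_HEAT * MASS_G * (T_BOIL - T_START)
print(f"Sensible heat: {sensible_j:,.0f} J ({sensible_j / 1000:.0f} kJ)")
# Sensible heat: 315,000 J (315 kJ)
```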
The world is a less than ideal place, so there will be issues: first convection, second evaporation, and third radiant heat loss. The glass at 25 C just sitting on the counter minding its own business is exchanging energy with its surroundings. If the water is at 25 C and the air temperature is 25 C, the net transfer is near zero, but some water is still evaporating unless the glass is sealed or the room is at 100% relative humidity. Each gram of water that evaporates has to absorb about 2260 Joules of energy in order to change phase to water vapor. As the water warms, the saturation vapor pressure increases, increasing the rate of evaporation, and each gram of water lost takes with it 2260 Joules plus 4.2 Joules for each degree it is above the initial 25 C. Now you can figure the total energy required to evaporate all the water: 2260 J/g times 1000 grams plus the 315 kJ just to sensibly raise the temperature to the boiling point, a total of 2575 kJ to boil all the water off.
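Adding the latent heat to the earlier sensible-heat figure, the same kind of sketch gives the 2575 kJ total:

```python
# Total energy to boil off 1 liter of water starting at 25 C:
# latent heat of vaporization plus the sensible heat to reach 100 C.
LATENT_HEAT = 2260.0   # J/g to change phase to vapor (value from the text)
SPECIFIC_HEAT = 4.2    # J/(g*K)
MASS_G = 1000.0        # one liter

sensible_j = SPECIFIC_HEAT * MASS_G * 75.0   # 25 C -> 100 C
latent_j = LATENT_HEAT * MASS_G
total_j = sensible_j + latent_j
print(f"Total: {total_j / 1000:.0f} kJ")     # latent dominates: 2260 vs 315 kJ
# Total: 2575 kJ
```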
It actually takes less energy to get the water to the boiling point than it takes to evaporate all of the water, provided convection and evaporation are limited. The main concern is the rate of heat transfer, or thermal efficiency.
Heating from above the surface runs counter to the direction of convection, thanks to gravity. Warm moist air rises, and the warmer the air, the more moisture it can contain. Since we have an initial temperature of 25 C (298 K) and an iron we can assume is approximately 200 C (473 K) on the cotton setting, the maximum heat transfer can be estimated using the Carnot efficiency: 1 - Tc/Th = 1 - 298/473 = 0.37, so 37% of the energy is the best you can hope to transfer under ideal conditions. Using the Stefan-Boltzmann law with a source temperature of 473 K, the effective energy under ideal conditions would be 2838 Wm-2, and with the sink energy at 298 K being 447 Wm-2, the difference is 2391 Wm-2; times 37% that equals 884 Wm-2 of ideal Carnot-efficiency transfer. That efficiency decreases as the sink (our liter of water) warms, so at the boiling point of 100 C (373 K) the efficiency drops to 1 - 373/473 = 0.21 or 21 percent, the sink emission rises to 1097 Wm-2, and the net transfer falls to (2838 - 1097) times 0.21, or about 368 Wm-2.
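Those flux numbers fall straight out of the Stefan-Boltzmann law and the Carnot factor; a short check (treating the iron and the water as ideal blackbodies, which is my simplifying assumption):

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 * K^4)

def flux(t_kelvin):
    """Blackbody emission at the given absolute temperature."""
    return SIGMA * t_kelvin ** 4

T_IRON, T_ROOM, T_BOIL = 473.0, 298.0, 373.0   # K

carnot_cold = 1 - T_ROOM / T_IRON                        # ~0.37
net_cold = (flux(T_IRON) - flux(T_ROOM)) * carnot_cold   # ~884 W/m^2

carnot_hot = 1 - T_BOIL / T_IRON                         # ~0.21
net_hot = (flux(T_IRON) - flux(T_BOIL)) * carnot_hot     # ~368 W/m^2

print(f"Source {flux(T_IRON):.0f}, sink {flux(T_ROOM):.0f} W/m^2")
print(f"Net at 25 C: {net_cold:.0f} W/m^2, net at boiling: {net_hot:.0f} W/m^2")
```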
Imagine that instead of radiantly heating from overhead, we immerse the iron in the water. The iron is rated at about 1000 watts (1 kW), which delivers 3,600,000 Joules per hour. The iron as an immersion heater could take "as little as" 60*(2.575/3.6) = 43 minutes to boil off the water, whereas radiantly, under ideal conditions and over roughly a square meter of transfer area, it would take (2,575,000/368)/60 = "up to" 116 minutes to "evaporate" the water out, since efficiency decreases with increased temperature in both cases, though much less in the first.
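The two time estimates check out the same way (assuming, as the division implies, roughly one square meter of radiant transfer area):

```python
TOTAL_J = 2_575_000.0   # energy to boil off the liter, from above
IRON_W = 1000.0         # immersion: the iron's full rated power
RADIANT_W = 368.0       # radiant: ideal Carnot-limited flux, ~1 m^2 assumed

immersion_min = TOTAL_J / IRON_W / 60.0    # ~43 minutes
radiant_min = TOTAL_J / RADIANT_W / 60.0   # ~117 minutes
print(f"Immersion: {immersion_min:.0f} min, radiant: {radiant_min:.0f} min")
```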
The fun part about the overhead radiant heat is that water vapor has a strong IR spectrum. As the water evaporates, the moisture absorbs radiation above the surface, locally heating the air and blocking a portion of the IR energy from reaching the true surface. This localized warming stimulates convection, reducing the effective surface pressure and increasing the rate of evaporation. We have a situation where evaporative cooling is stimulated, so the bulk of the water in the glass may not warm at all with direct longwave radiant heat applied, not just "back radiation". With cold water in the bowl and less water vapor in the air, the direct IR radiation should be more efficient and actually do a little warming.
If you decide to do this experiment at home, remember firstly that I am irresponsible, er, not responsible for any domestic situations that may result, and secondly that you should not be surprised if the water temperature in the glass/bowl never gets over the 30 to 35 C range without conduction being involved. I haven't had the patience to do this experiment, so let me know how things work out, and let me know if you find any typos/errors that hinder your scientific creativity.