Around 200 B.C., the Greeks observed that shadows cast by sticks in two cities ~800 km apart differed in angle by ~7°. From this they estimated the diameter of Earth to be 9,000 km, about 30% below today's accepted value of 12,742 km.
The simplicity of the tools involved is impressive, but what if, lacking two sticks, we instead attempted the calculation with two pennies?
While on vacation, my view toward the Gulf of Mexico horizon happened to align with the deck railing, and the slight curvature of the Earth became apparent. I stacked two pennies on the railing to gauge the deviation between the (presumably) straight railing and the curved horizon. Deriving Earth's diameter from there took ~4 hours.
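The post never states the formula, so here is one standard reconstruction, assuming the two-penny stack measures the sagitta of the horizon arc as seen against the railing. The symbols are mine, not the author's: h = eye height above sea level, L = railing width, D = eye-to-railing distance, s = the measured deviation.

```latex
% Horizon dip for an observer at height h above a sphere of radius R:
\delta \approx \sqrt{\tfrac{2h}{R}}
% Projected onto the railing plane at distance D, the horizon drops below
% its center point by roughly \delta x^2/(2D) at horizontal offset x, so
% across the railing half-width x = L/2 the sagitta is
s \approx \frac{\delta L^2}{8D}
% Eliminating \delta gives a radius, and hence a diameter, of
R \approx \frac{h L^4}{32\, s^2 D^2}, \qquad d = 2R \approx \frac{h L^4}{16\, s^2 D^2}
```

Note the 4th-power dependence on the railing width L, which the sensitivity list below turns up empirically.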
The penny calculation comes to 73.2% of the true value (12,742 km), and 103% of the Greek estimate from 200 B.C.
Which inputs, if different (and by how much), would have produced exactly the true diameter?
If the view over the railing deviated by 1.71 pennies, not 2.0 (a 15% difference)
If the railing width were 3.78 m, not 3.5 m (8% difference)
If the viewer distance to railing were 1.53 m, not 1.8 m (18% difference)
If the elevation above sea level were 20.33 m, not 14.88 m (37% difference)
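The railing-width, viewer-distance, and elevation corrections above can be cross-checked against a single power law. Assuming the derived diameter scales as h·L⁴/(s²·D²) (my reconstruction of the geometry; the post does not state its formula), each corrected input must, on its own, scale the 73.2% result by 1/0.732 ≈ 1.366:

```python
# Cross-check of the "what if" corrections, assuming the derived diameter
# scales as h * L**4 / (s**2 * D**2). The penny estimate came out at
# 73.2% of the true value, so each corrected input must multiply the
# result by 1/0.732 to land exactly on the true diameter.

target = 1 / 0.732  # required correction factor, ~1.366

checks = {
    "railing width L: 3.5 m -> 3.78 m (4th power)": (3.78 / 3.5) ** 4,
    "viewer distance D: 1.8 m -> 1.53 m (inverse square)": (1.8 / 1.53) ** 2,
    "elevation h: 14.88 m -> 20.33 m (linear)": 20.33 / 14.88,
}

for label, factor in checks.items():
    print(f"{label}: x{factor:.3f} vs target x{target:.3f}")
```

The elevation line matches the 1.366 target almost exactly (20.33/14.88 ≈ 1.366), which supports the linear-in-h scaling.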
While I expected the number of pennies (or dimes, or nickels) to be the dominant source of error, in fact I should have measured the railing width more carefully, since the calculation is sensitive to its 4th power. Had the actual railing been 8% wider than my estimate, the calculation would have produced Earth's diameter exactly.
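Under that same assumed scaling, the sensitivity is easiest to read off from log-derivatives: the exponent on each input is the leverage of that input's relative error on the final answer. A minimal sketch (the function name and exponents are mine, following the assumed h·L⁴/(s²·D²) scaling):

```python
# First-order error propagation for a diameter that scales as
# h * L**4 / (s**2 * D**2). Taking logs, each exponent becomes the
# leverage of that input's relative error on the derived diameter.

def diameter_error(dh=0.0, dL=0.0, ds=0.0, dD=0.0):
    """Approximate relative error in the derived diameter, given
    relative errors in elevation h, railing width L, sagitta s,
    and viewer distance D."""
    return 1 * dh + 4 * dL - 2 * ds - 2 * dD

# An 8% underestimate of the railing width alone shifts the result by
# about 4 * 8% = 32% to first order (the exact factor is 1.08**4 - 1):
print(diameter_error(dL=0.08))
```

So an 8% railing-width error carries roughly the same weight as a 32% elevation error, which is why the railing deserved the careful tape measure.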