Here is a physics problem that I made up.
Two Israeli fighter jets are flying over Saudi Arabia. One is heading towards Mecca and the other towards Medina. Each aircraft has a velocity of 600 knots and an altitude of 1500 ft above ground level. Ignoring air resistance, how far from each city do the planes have to release their bombs?
The first step is to convert our units. 1500 ft is approximately 457.2 meters, and 600 knots is about 308.66 m/s. There are six equations that we need to solve: three for the vertical motion and three for the horizontal. For the vertical coordinate system:
Ay(t) = -9.81 m/s^2, the constant acceleration due to gravity
Vy(t) = -9.81*t + Vy(0) m/s, where Vy(0) = 0 because the aircraft is flying level with respect to the ground
Dy(t) = -9.81/2*t^2 + Dy(0), where Dy(0) = 457.2 m is the altitude of the aircraft (the Vy(0)*t term drops out because Vy(0) = 0)
Setting Dy(t) = 0 and solving for t gives t = sqrt(2 * 457.2 / 9.81), so the time it takes for the bomb to hit the ground is about 9.65 seconds.
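As a quick numerical check, here is a minimal Python sketch of the vertical solution. The constant names (G, ALTITUDE_FT, FT_TO_M) are my own, not part of the problem.

import math

G = 9.81          # gravitational acceleration, m/s^2
ALTITUDE_FT = 1500.0
FT_TO_M = 0.3048  # exact conversion factor

altitude_m = ALTITUDE_FT * FT_TO_M         # 457.2 m
# Solve Dy(t) = 0 for t:  0 = -G/2 * t^2 + Dy(0)
fall_time = math.sqrt(2 * altitude_m / G)  # ~9.65 s
print(f"Fall time: {fall_time:.2f} s")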
For the horizontal coordinate system:
Ax(t) = 0, since the bomb is not equipped with thrusters
Vx(t) = 308.66 m/s, since that is the release velocity and neither drag nor thrust acts on the bomb
Dx(t) is the unknown; since Dx(t) = Vx*t, the horizontal distance is 308.66 m/s * 9.65 s, which is about 2.98 km
Converting this back into nautical miles, the answer is about 1.6 nautical miles.
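Putting both coordinate systems together, here is a short sketch of the full release-distance calculation. Again, the names are hypothetical; KNOT_TO_MS and M_PER_NM are just the standard conversion factors.

import math

G = 9.81               # m/s^2
KNOT_TO_MS = 0.514444  # 1 knot ~ 0.514444 m/s
M_PER_NM = 1852.0      # meters per nautical mile (exact)

speed_ms = 600 * KNOT_TO_MS           # ~308.67 m/s
fall_time = math.sqrt(2 * 457.2 / G)  # ~9.65 s, from the vertical solution
range_m = speed_ms * fall_time        # Dx = Vx * t, ~2.98 km
print(f"Release distance: {range_m / 1000:.2f} km "
      f"({range_m / M_PER_NM:.2f} NM)")

Running this prints a release distance of about 2.98 km, or roughly 1.6 nautical miles, matching the hand calculation above.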