Hitching a ride on a black hole, part IV: Going on a 10g jaunt
You might be wondering why I picked 10g as the star drive’s acceleration, and whether it gains you much over a leisurely 1g or less. Let us deal with this question.
First, assuming perfect mass-to-energy conversion, how long can we keep accelerating before all the fuel is used up? (We will not be considering ramjet-style refueling by scooping up the interstellar medium.) An honest calculation would require some calculus, but we can skip that by noting that when the rate of change of a quantity is a constant fraction of the quantity itself, the quantity decreases exponentially. A well-known example is radioactive decay.
So, suppose we start with a rocket ship of mass M. If every T seconds we eject m kg of fuel with velocity v, then the thrust pushing the rocket forward is F = mv/T and its acceleration is a = mv/(MT). After T seconds the rocket’s mass is M − m, so to keep the acceleration constant we need to reduce the thrust correspondingly, and continue to do so as the rocket gets lighter and lighter. As a result, the rocket’s mass decreases slower and slower with time, getting to about 37% (that is, 1/e) of its original value after t = v/a seconds (and to 37% of that after another t seconds). For a light-speed exhaust this works out to be only 35 days at 10g acceleration.
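If you want to check the 35-day figure yourself, here is a quick Python sketch (taking g = 9.81 m/s² and a light-speed exhaust, so the e-folding time is t = c/a):

```python
c = 299_792_458.0              # speed of light, m/s
g = 9.81                       # one g, m/s^2

# e-folding time t = v/a with a light-speed exhaust (v = c)
t_1g_days = c / g / 86_400
t_10g_days = c / (10 * g) / 86_400
print(t_1g_days)               # ~354 days: almost a year at 1g
print(t_10g_days)              # ~35 days at 10g
```

The 1g figure, just under a year, is the one that shows up again later when we compare the two accelerations.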
This should give you pause. It sure made me double-check this simple calculation, just in case. With our star drive we would have to basically annihilate 60% of the star in just one month. For scale, the Sun, burning hydrogen and then helium for over 10 billion years, will process only about 10% of its mass, converting well under 1% of it into energy. And we plan to burn through that much in about a week! Or at least a week by the ship’s clock, since we would be close to the speed of light after only a few weeks, once time dilation firmly sets in. Still, this is very much comparable to a supernova explosion, only going on for weeks non-stop.
Now would be a good time for environmental considerations. If the exhaust consists mostly of electromagnetic radiation, then whatever is in the path of the exhaust would not fare well, not within a few light years, anyway. Says Wikipedia:
Gamma rays induce a chemical reaction in the upper atmosphere, converting molecular nitrogen into nitrogen oxides, depleting the ozone layer enough to expose the surface to harmful solar and cosmic radiation
And that is from a few dozen light years away. Using near-light-speed massive particles, like hydrogen plasma jets, is just as bad. CEPA, the Cosmic Environmental Protection Agency, might be a bit put out with us for destroying all life on a planet or ten.
Fortunately, there is a way out. Kind of. If the exhaust mostly consisted of neutrinos, the impact on the surrounding Cosmos would be minimal. Neutrinos go through everything almost completely unaffected, so even at a few million km the radiation exposure from them is quite small.
Interestingly enough, most of the energy released in a core-collapse supernova, the kind where a large star runs out of gas and collapses on itself until everything in it turns into neutrons, is in the form of neutrinos. However, even this is way too weak by our standards, as most of the mass-energy is not released at all, but stays locked in the neutron core or the resulting black hole. Yes, that’s right: one of the most dreadful star explosions we observe, visible a few galaxies over, just doesn’t produce enough energy to power our star drive.
There are good reasons why it is very hard to turn normal matter into neutrinos completely, most of them related to conservation laws. Specifically, we currently do not know of any way to turn “quarks”, the stuff of which atomic nuclei are made, into “leptons”, which is what neutrinos are. It is entirely possible that as-yet-unknown laws of physics come into play at very high energies, but, if so, this energy would have to be higher than anything a mere supernova can produce, since we do not see supernovae disappear in a puff of neutrinos.
One bit of good news is that this is not an issue for black/white holes, as quark number is not conserved (or at least not visible) once the black hole is formed. Only mass, angular momentum and electric charge are preserved by the collapse. So there is hope that if we give up on “real” stars, like white dwarfs and neutron stars, and build our star drive out of the black/white hole combo instead, there is no restriction on the type of material emitted.
Anyway, back to our calculations. Suppose we have run at full power for some time, and are left with a fraction X of our star drive. How fast are we going now? Well, almost at the speed of light, if enough has been burned. What is more interesting is not the exact speed, but the time dilation/space contraction factor, because this factor is what tells you how fast you get to where you are going. Like, if the factor is 10, then you get across a distance of 10 light years in only one year of ship time.
So, with a fraction X of the initial mass remaining, the space contraction factor for a 100%-efficient rocket with light-speed exhaust is γ = (1 + X²)/(2X). While the expression itself is simple, I have not found an easy way to derive it without cranking through the relativistic rocket calculations, so please forgive me for just dropping it here without a proper motivation. And if you know of a way, please comment.
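Not a derivation, but at least a numerical sanity check: we can fly a photon rocket burst by burst, conserving energy and momentum in the lab frame, and compare the resulting Lorentz factor with the closed form. A small Python sketch (units with c = 1; the step count is just an accuracy knob):

```python
import math

def contraction(X):
    """Closed form: gamma = (1 + X^2) / (2X) for remaining mass fraction X."""
    return (1 + X * X) / (2 * X)

def simulated(X, steps=100_000):
    """Start at rest, radiate rest mass backward as light in small bursts
    until the fraction X remains, and return the final Lorentz factor."""
    E, p = 1.0, 0.0                   # lab-frame energy and momentum
    dm = (1.0 - X) / steps            # rest mass radiated per burst
    for _ in range(steps):
        beta = p / E
        g = 1.0 / math.sqrt(1.0 - beta * beta)
        E_ph = g * dm * (1.0 - beta)  # backward burst boosted to the lab frame
        E -= E_ph
        p += E_ph                     # photon momentum is -E_ph; rocket recoils
    return E / math.sqrt(E * E - p * p)  # gamma = energy / invariant mass

print(contraction(0.4), simulated(0.4))  # both come out ~1.45
```

The two values agree to several decimal places, which is reassuring even if it is not a proper motivation.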
Let’s plug in some numbers! As we have seen earlier, after one month of travel at 10g (or after almost a full year at just 1g) we are left with about 40% of the fuel we had at the start. Plugging X = 0.4 into the formula above, we get γ ≈ 1.45, not a very impressive number. It reduces our subjective travel time by barely 30%. If we give it another month, we are down to X = 0.16 and γ ≈ 3.2. Only when we burn away 95% of the star do we get a factor of 10 for our time dilation effect. How long will it take at 10g? About 3 months ship time.
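These numbers are easy to reproduce; a short Python sketch, again taking g = 9.81 m/s²:

```python
import math

c = 299_792_458.0
g10 = 10 * 9.81
tau_days = c / g10 / 86_400             # e-folding time at 10g, ~35 days

def remaining(days):                    # fuel fraction left after ship time t
    return math.exp(-days / tau_days)

def contraction(X):                     # gamma = (1 + X^2) / (2X)
    return (1 + X * X) / (2 * X)

print(remaining(30))                    # ~0.43: roughly 40% after a month
print(contraction(0.4))                 # ~1.45
print(contraction(0.16))                # ~3.2
print(contraction(0.05))                # ~10: burn away 95% for a factor of 10
print(tau_days * math.log(1 / 0.05))    # ~106 days of ship time to get there
```

The last line is the ship time needed to burn down to 5%, which rounds to the 3 months quoted above.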
So, we accelerate until our time dilation factor is 10, then what? Turn off the star drive and cruise? But then we are left weightless for most of the flight, until we start decelerating, which kind of defeats the purpose of our ingenious setup for providing a steady 1g to the crew. Also, how long would we have to cruise weightless? With a time dilation factor of 10 it would take a full 3 years to get to, say, Vega, counting acceleration and deceleration, including about 2.5 years with the drive off. And after decelerating we are left with 5% of 5%, or just one quarter of one percent, of the original star. Somewhat wasteful, is it not? We used up a whole star in the most efficient way imaginable to get not across the Universe, not across the Galaxy, but across less than one one-thousandth of the Milky Way. And it took us three years to do it.
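The Vega timing can be checked with the standard constant-proper-acceleration formulas (per leg, ship time φ/a and distance (cosh φ − 1)/a, where φ is the rapidity and c = 1); a sketch, taking Vega to be about 25 light years away:

```python
import math

# 10g in units where c = 1, distances in light years, times in years
g10 = 10 * 9.81 * 365.25 * 86_400 / 299_792_458.0   # ~10.3 per year
D = 25.0                     # distance to Vega, roughly 25 light years
gamma = 10.0                 # cruise time-dilation factor
phi = math.acosh(gamma)      # rapidity reached by the end of the burn
beta = math.tanh(phi)        # cruise speed, ~0.995c

t_leg = phi / g10            # ship time per acceleration/deceleration leg
d_leg = (gamma - 1) / g10    # distance covered per leg, light years
t_cruise = (D - 2 * d_leg) / (gamma * beta)   # ship time with the drive off

print(t_cruise)              # ~2.3 years of weightless coasting
print(2 * t_leg + t_cruise)  # ~2.9 years ship time, door to door
```

This roughly reproduces the 3-year figure, with most of the trip spent coasting.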
To summarize the disappointing answer to our original question, 10g acceleration gets us the time dilation/space contraction factor of 10 if we accelerate for 3 months and burn away 95% of the fuel. But wait! Not all is lost quite yet. This result is for a conventional rocket, not a black-hole joy ride, where the effects of General Relativity are also important. We will see if it makes any difference next time.