Monday, February 27, 2012

The material density map

Now we've got something (a fractal-noise image) that reasonably approximates, for our purposes, the elemental-matter-energy density of a galaxy. A 1000x1000 pixel image would represent an entire galaxy, so if we were making a galaxy the size of the Milky Way, each pixel would represent roughly 600 trillion miles, or 9.5×10^14 kilometers, of space. That's a lot. This is only a proof of concept, and I'm not ready to start diving into that kind of scale, so I'm going to reduce our galaxy to something a little more manageable: say, about 1/1,000,000,000,000,000th (10^-15) of that size. That gives a galaxy about 600 miles (1000 kilometers) across, where each pixel represents about 0.6 miles (1 km) on a side. Everything in this galaxy will (probably) be scaled down accordingly: planets, stars, etc.
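To make that scale concrete, here's the back-of-the-envelope arithmetic behind those numbers (my own recomputation, assuming a Milky Way diameter of roughly 100,000 light-years):

    // Rough scale check: a 1000-pixel-wide map of a Milky-Way-sized galaxy.
    double milesPerLightYear = 5.88e12;  // ~5.88 trillion miles per light-year
    double galaxyDiameterLy = 1e5;       // ~100,000 light-years across
    int pixelsAcross = 1000;

    double milesPerPixel = galaxyDiameterLy * milesPerLightYear / pixelsAcross;
    // ~5.9e14 miles (~600 trillion), or about 9.5e14 kilometers, per pixel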

(Note: I can generate density maps larger than 1000x1000, which would give us much finer-grained detail about the galaxy, but this is just a starting point. Increasing the image size also makes pattern generation take dramatically longer and currently produces images that are usually too dim to be very usable. This is something I'll come back to address later.)

Next we need a starting number. This number will be a reference point for the total energy and mass that exist in this galaxy (remember, they're the same thing both in reality and, very transparently, in our model). We can produce this matter-energy (ME) number by simply totaling the values of every point in our density map. Here's an example total: 32329096.
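As a minimal sketch of that step (hypothetical code; assume the fractal-noise density map is held as a 2D array of doubles):

    /**
     * Totals every point in the density map to get the galaxy's
     * matter-energy (ME) reference number.
     */
    public static double totalMatterEnergy(double[][] densityMap) {
        double total = 0.0;
        for (double[] row : densityMap) {
            for (double value : row) {
                total += value;
            }
        }
        return total; // e.g. ~32329096 for the example map above
    }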

Now we need to take our density map and divide it up. Each pixel covers 1 sq. km, which we need to divide into more fine-grained units--basically the tiles of a local area map, or cell. A square meter is an obvious first choice. Since 1 km = 1000 m, each pixel on our density map represents a cell of 1000 m × 1000 m = 1,000,000 square meters. That's a little more than 1/100th of the area of Manhattan--per pixel, a million times over--which makes our whole galaxy (1,000,000 sq. km) about 1/10th the area of the entire US.

So, say we're examining a pixel in the dead center of the density map. We can estimate its individual ME value at the galaxy-wide average: 32329096 / (1000*1000) ≈ 32.33. That small number is the amount of matter and energy available to distribute around the cell that this pixel represents. For the time being, we won't allow values to spill across cell boundaries, so each cell is, in a sense, self-contained.
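Continuing the hypothetical sketch above, the estimate is just the total over the pixel count, while a particular cell's actual budget is its own pixel's value:

    double totalME = totalMatterEnergy(densityMap); // ~32329096 from above

    // Galaxy-wide average ME per cell: 32329096 / (1000 * 1000) ≈ 32.33.
    double averageME = totalME / (1000.0 * 1000.0);

    // A specific cell's actual budget is its own pixel's value in the map:
    double localME = densityMap[500][500]; // the dead-center pixel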

Now the noise functions mentioned in the previous post become really useful. We can generate a 1000x1000 noise map (these pixels represent meters now) and scale it to our cell's ME value relative to the maximum (and minimum) ME values of the entire galaxy, giving us a proportionate and appropriately random (or directed, if we want to steer it a bit) distribution of matter and energy within the cell.

What we need is a noise map whose values, totaled, equal the local ME value. If the local ME value is 32.33, we want a noise map where each point's value is, on average, 32.33 / (1000*1000) = 0.00003233. We can get there by producing a map of random noise values and using a little math to scale them into the desired range, which yields a noise map whose total matches its corresponding ME value in the density map. And if we render an image from this by normalizing each point against the scaled maximum possible point value (the one belonging to the density map point with the highest ME), we can see that each cell has a "brightness" to match:

[Images: the cell noise map rendered for cells with 10%, 25%, 50%, and 75% of the highest cell's ME value, with brightness scaling to match.]
But herein lies the rub. We can't do this ahead of time: while each noise map takes only about half a second to produce (and that's using a fast cosine approximation during interpolation--the standard Java one takes a full second), we'd be producing one for each pixel of a 1000x1000 density map. Even filtering out zero-valued pixels, at least ~50% of them still carry a value, so producing every noise map would take nearly 70 hours (half that with less-pretty linear interpolation--still far too long). And that's apart from the fact that we certainly couldn't hold them all in memory; they'd eat up about 2 terabytes of HD space. Nope. We need to produce them only as needed. I'll begin to address that next.
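For reference, the back-of-the-envelope arithmetic behind those figures (my own recomputation, assuming 4 bytes per noise-map point):

    long maps = 1000L * 1000L / 2;        // ~50% nonzero pixels = 500,000 maps
    double hours = maps * 0.5 / 3600.0;   // 0.5 s each -> roughly 69 hours
    double terabytes = maps * 4e6 / 1e12; // 4 MB per 1000x1000 map -> ~2 TB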
