Post has attachment
Before I left for the science castle, I left the M33 data cube rendering in VR at higher resolution and with wider camera separation for a stronger depth effect. I also saved it as an mp4, which means I can add the metadata needed for it to work on YouTube. So you don't need a headset for this one: you can view it in a regular browser and use the mouse to look around. You can even turn on red-green 3D if you really want. Of course it's a lot better with an actual headset, and having it on YouTube should make viewing it that way a lot easier too.

This is still just a proof-of-concept test but I'm quite happy with the result. It could be fun to make this into a more fully developed, explanatory tour. What would be really nice would be to use the full AGES cube (this is only about 3% of the total), though that will require a different technique because otherwise I'll exceed the 1,024 image texture limit in Blender. I'll see how well Cycles handles image sequences as volumetrics. In the meantime I have a more detailed data cube to try out.

Post has attachment
Some major improvements to the M33 VR render. Low resolution video but that really doesn't matter because the data is low resolution anyway. Much better colour scheme so you see a lot more detail in this one, and the data range now shows enough noise to give a better sense of depth. Plus the colour scheme is just much prettier.

Unfortunately I forgot to render this in the .mp4 format required for YouTube so you'll probably still have to download this one, and it's only suitable for headsets. I'll try and get the YouTube version working next week.

I think this would be a very nice way to give a tour through a data cube. A full AGES data cube (e.g. https://www.youtube.com/watch?v=1YWGZhXe_gA) would be a lot of fun, but that would mean breaking the image texture limit of Blender 2.78 and earlier. So either I reinstall Linux on my work machine, or try to get the image sequences to render as Cycles volumetrics instead of textured planes.

Post has attachment
Proof of concept: M33 HI data cube in VR. It has a lot of little flaws but the basic concept works: you fly through the data, it's visibly 3D, and you get full 360 coverage. You need a headset to view this one. Once I iron out the problems (the data has been smoothed too much, the colour scheme removes too much of the noise that would provide useful reference points for depth, and there's probably too much saturation) I'll upload it to YouTube with metadata, so you can pan around in a regular web browser.

This one was created using the bare minimum display code of FRELLED (http://www.rhysy.net/frelled-1.html) converted to use Blender's Cycles engine, which can handle the equirectangular camera format needed for 360 spherical stereo video. This has to be rendered rather than captured in real time (though the Cycles camera supports an equirectangular display in the realtime preview, it doesn't seem to allow capturing preview animations the way the OpenGL view does). It also requires having all three projections visible at once, so it's rather slow.
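The plane-stacking trick behind this kind of display can be sketched outside Blender: each channel of the cube becomes an RGBA image whose alpha follows intensity, so faint voxels turn transparent and the stacked planes mimic a volume. Here's a minimal numpy sketch with a mock cube — the function name, shapes, and greyscale mapping are illustrative assumptions, not FRELLED's actual code:

```python
import numpy as np

def cube_to_rgba_slices(cube, vmin=None, vmax=None):
    """Convert a (channels, y, x) data cube into per-channel RGBA images.

    Intensity maps to both greyscale colour and alpha, so faint voxels
    become transparent -- the trick that lets stacked planes mimic a volume.
    """
    vmin = cube.min() if vmin is None else vmin
    vmax = cube.max() if vmax is None else vmax
    norm = np.clip((cube - vmin) / (vmax - vmin), 0.0, 1.0)
    slices = []
    for channel in norm:                      # one image per velocity channel
        rgba = np.empty(channel.shape + (4,))
        rgba[..., 0] = rgba[..., 1] = rgba[..., 2] = channel  # grey colour
        rgba[..., 3] = channel                                # alpha = intensity
        slices.append(rgba)
    return slices

# Mock cube: 16 velocity channels of 32x32 pixels.
cube = np.random.default_rng(0).random((16, 32, 32))
slices = cube_to_rgba_slices(cube)
```

Each resulting image would then be applied to its own transparent plane, spaced along the velocity axis.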

Previously I was hell-bent on getting the ALFALFA data catalogue rendered in VR, but the limitation was that Blender versions 2.78 and below don't allow more than 1,000 image textures. And my work machine, which can comfortably handle intensive processing jobs for days on end without batting an eye, won't let me install 2.79 (which doesn't have a texture limit) unless I do some massive upgrading of my Linux installation. Fortunately, while many HI data cubes have the equivalent of more than 1,000 images, most of them don't need that many - in fact, removing most of the images actually gives the final renders a more detailed, less saturated appearance.
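Staying under the texture limit then amounts to simple channel subsampling: keep an evenly-spaced subset of slices so the count fits. A hypothetical sketch of the arithmetic (the stride logic and function name are mine, not anything from Blender):

```python
def subsample_channels(n_channels, max_textures=1000):
    """Pick evenly-spaced channel indices so the count stays under
    the image texture limit (1,000 in Blender <= 2.78)."""
    if n_channels <= max_textures:
        return list(range(n_channels))
    stride = -(-n_channels // max_textures)   # ceiling division
    return list(range(0, n_channels, stride))

# e.g. a cube whose planes would need 2,400 images:
kept = subsample_channels(2400)
```

Here a 2,400-image cube drops to every third channel, comfortably under the limit.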

More on the data cube on display here :
http://astrorhysy.blogspot.com/2015/11/keenans-ring.html

Post has attachment
Thirty Thousand Galaxies

I'm still working on getting this into VR format, but as that's hitting a wall for the moment, here it is in conventional format. These are the 30,087 galaxies from the complete ALFALFA catalogue, detected in neutral hydrogen but shown here via their optical components. The images are to scale (with a few that are wildly inaccurate) but exaggerated in size by a factor of 50. More details here:
http://astrorhysy.blogspot.com/2013/03/galaxies-are-pretty.html

Minor adjustments: I'm using ALFALFA's estimate of distance, which is more sophisticated than just using the redshift directly. For size I'm no longer using the Petrosian radius thingy from the SDSS, as that's crap, but instead assuming a constant surface density for the hydrogen. This works much better, though it still fails from time to time, so you'll see a few clipped images. That probably happens when a galaxy has much less gas than normal.
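For the curious, a constant-surface-density size estimate amounts to R = sqrt(M / (pi * Sigma)) for a uniform disc, with the HI mass itself coming from the standard flux-distance relation M = 2.356e5 * d^2 * S (solar masses, d in Mpc, S in Jy km/s). A sketch of that arithmetic — the surface density value here is an illustrative assumption, not the one actually used in the renders:

```python
import math

def hi_mass(distance_mpc, flux_jy_kms):
    """Standard single-dish HI mass estimate, in solar masses."""
    return 2.356e5 * distance_mpc**2 * flux_jy_kms

def hi_radius_pc(mass_msun, sigma_msun_pc2=5.0):
    """Radius of a uniform disc of fixed surface density.

    Sigma ~ 5 Msun/pc^2 is an illustrative guess, not the value
    used for the actual catalogue renders.
    """
    return math.sqrt(mass_msun / (math.pi * sigma_msun_pc2))

m = hi_mass(20.0, 1.5)   # e.g. a galaxy at 20 Mpc with 1.5 Jy km/s of flux
r = hi_radius_pc(m)      # -> a few kpc, a plausible HI disc size
```

A gas-poor galaxy gets a small radius this way, which is presumably where the clipped images come from.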

The VR format has hit a frustrating point. The standard version (below) is easy and fast to render. The VR one requires Cycles, which does this really stupid synchronising-objects-and-loading-images thing that's much, much slower than the actual rendering: by default it takes about six minutes to prepare and then about three seconds to render. With some clever tricks I've got that down to two minutes. Normally I'd throw this onto my beefy work machine and let it chug away for a while, but that's not possible here. I need Blender 2.79, as previous versions limited the number of image textures to 1,024, and I can't run that version on my work computer because 2.79 is compiled against an updated version of glibc, which means I'd have to update the OS. I can't render in passes either because, for some reason, GOD KNOWS WHY, images with transparent backgrounds don't render correctly in Cycles.

These things were sent to try us...

Post has attachment
I've been wanting to render galaxy flythroughs in VR for ages, but Blender's 1,000 image texture limit has been restrictive. Not any more: it's been removed in 2.79. There are also new encoding options which seem to give much better results, though I can't vouch for how well this will translate to YouTube. Here's a fairly small proof-of-concept test with 5,000 galaxies, detected by (and with distance measurements from) the ALFALFA hydrogen survey. Optical images are from the Sloan Digital Sky Survey.

This is designed for VR headsets/Google Cardboard. I'm curious what people are using to view these, as my own headset (so far as I can tell) can't get YouTube to play 3D 360 VR correctly, so I just view the original video file instead.

I do these periodically whenever ALFALFA updates their catalogue. The first one, from the 30% catalogue, had about 11,000 galaxies (https://www.youtube.com/watch?v=6417rSsTeBQ&t=3s). The second (70%) had about 22,000 (https://www.youtube.com/watch?v=_t7Pbl-YU4k). The final catalogue has 31,000. It should be entirely feasible to render that; the galaxy images use a surprisingly small amount of memory.

More explanations of the renders here : http://astrorhysy.blogspot.com/2013/03/galaxies-are-pretty.html

Post has attachment
A flight through VLA data of Milky Way hydrogen. This is an attempt to show a solid surface changing imperceptibly into a diffuse volume. There are 240 planes, each with a different slice of the data and with transparency controlled by colour. Each has a build modifier that randomly increases the number of rendered faces, with staggered start times increasing outwards from the two initial meshes. The jump in colour at the end is unintentional, but I kinda like it.
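The staggering is simple to set up: each plane's build modifier starts later the further the plane sits from the two seed meshes. A hypothetical sketch of the frame arithmetic (the seed positions, base frame, and step size are my own illustrative choices):

```python
def build_start_frames(n_planes, seeds=(0, 239), base_frame=1, frames_per_step=2):
    """Start frame for each plane's build modifier, growing outwards
    from the seed planes so the effect spreads progressively."""
    starts = []
    for i in range(n_planes):
        distance = min(abs(i - s) for s in seeds)   # steps from nearest seed
        starts.append(base_frame + frames_per_step * distance)
    return starts

starts = build_start_frames(240)   # one start frame per data plane
```

In Blender each value would then be assigned to the corresponding build modifier's start frame.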

Post has attachment
Someone on YouTube asked for a side-by-side 3D version of the ALFALFA sky. So, here it is. You can view it with 3D TVs, headsets, projectors and suchlike, I guess. Or with the naked eye, if you can free-view stereo pairs.
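Side-by-side 3D is just the left- and right-eye renders glued into one frame. A minimal numpy sketch (the frame sizes here are arbitrary):

```python
import numpy as np

def side_by_side(left, right):
    """Pack left/right eye frames into one side-by-side stereo frame."""
    if left.shape != right.shape:
        raise ValueError("eye frames must match")
    return np.hstack([left, right])

# Mock 1080p eye renders: one black, one white, to make the halves obvious.
left = np.zeros((1080, 1920, 3), dtype=np.uint8)
right = np.full((1080, 1920, 3), 255, dtype=np.uint8)
frame = side_by_side(left, right)   # 1080 x 3840 stereo frame
```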

This is the same version as previously, using the ALFALFA 70% complete catalogue (22,000 galaxies). The 100% catalogue (31,000 galaxies) is now online, so that will be my next little project once I finish with the abstract stuff. Ideally it will be in 360 3D, if I can figure out how to get around Blender's silly Cycles image texture limit (or maybe that's already fixed...).

Post has attachment
This one uses all-sky HI data to displace a sphere as well as influence its colour.
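Displacing a sphere by data amounts to pushing each vertex radially outward in proportion to the value sampled there. A numpy sketch on random points projected to a unit sphere — the displacement strength and the mock data are arbitrary choices, not the actual setup:

```python
import numpy as np

def displace_sphere(vertices, values, strength=0.3):
    """Scale unit-sphere vertices radially by their data value:
    r = 1 + strength * value, so bright regions bulge outwards."""
    radii = 1.0 + strength * values
    return vertices * radii[:, None]

rng = np.random.default_rng(1)
verts = rng.normal(size=(100, 3))
verts /= np.linalg.norm(verts, axis=1, keepdims=True)   # project to unit sphere
vals = rng.random(100)                                  # mock HI intensities in [0, 1)
displaced = displace_sphere(verts, vals)
```

The same values would also drive the colour, which is what ties the bumps and the colouring together.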

Post has attachment
This one exploits normally-annoying rendering artifacts. Blender's realtime display can only show objects from a single side, so for nested transparent spheres you can normally see only their interiors or exteriors, not both. But by clipping out all the lowest transparency values, part of the other side is revealed. Couple this with a wide-angle lens and a changing viewpoint and you get quite a lovely mess.
