Andrew Bortz

Post has attachment
Solo backpacking in Point Reyes National Seashore. First time camping in years!

Post has attachment
I've finally mounted the San Francisco panorama! :-D

Post has attachment
So, after a long hiatus, I'm back with more astrophotography!

Here is a photo of Andromeda (again, I know!), this time with 42 minutes of light time from a remote observatory. While this is about half my previous attempt (1 hr 17 mins), it's with a lens twice the aperture, and therefore 4x the light gathering capability -- and it shows. The resolution and noise of this latest attempt are far better than in the past, and the total cost (well, if I weren't still using free "trial" minutes) would only be $40, around the cost of renting a nice lens. And I didn't even have to go out in the cold. Although that part can be pretty fun too…

This album documents the evolution of my attempts, from newest to oldest. You decide. :-)

Awesome technical details:

This data was actually collected almost a year ago, but, as I described in a previous post, my software processing pipeline had reached its limits. As background, typical high-end astrophotography systems have monochromatic sensors, and use filters to capture different color channels. Here, I decided to experiment with filters typically used for stellar research (namely, Johnson filters V and B), along with a visual passband filter (weirdly called "Luminance"), and see if I could construct true color images from that data. This was partly motivated by the fact that this particular remote telescope didn't have "standard" color filters. :-)

Also, the "standard" color astrophotography capture plan involves capturing three channels worth of color data at half resolution ("R", "G", and "B", and I use quotes because these filters barely correspond to sRGB primaries), and a "Luminance" channel at full resolution, totaling 4 channels of data. I decided this was silly and redundant, and experimented with capturing only 3 channels of data: a "Luminance" channel at full resolution, and V and B channels at half resolution.
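The data savings are easy to quantify. This is illustrative arithmetic, not from the original post, and it assumes "half resolution" means 2x2 binning (a quarter of the pixels per frame):

```python
# Relative pixel budget of the two capture plans, taking a full-resolution
# frame as 1.0 and a half-resolution (2x2 binned) frame as 0.25 of that.
full, half = 1.0, 0.25

standard_lrgb = full + 3 * half   # "L" at full res + "R", "G", "B" at half res
lvb_plan      = full + 2 * half   # "L" at full res + V and B at half res

print(standard_lrgb)  # 1.75
print(lvb_plan)       # 1.5
```

Per unit of exposure time, the 3-channel plan moves about 14% less data while keeping the full-resolution luminance intact.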

All of this is pretty much heresy in the community, but I'm incredibly satisfied with both experiments. To be fair, this required a decent amount of computational machinery, but I can (and will!) share it. The trickiest piece is computing a color transform from LVB to RGB. I did this by taking the known spectra of the filters and CCD used, simulating the response to a standard set of color patches (namely, the ColorChecker colors), and computing an optimized matrix transform. The fit isn't perfect, but it's definitely good enough, and better than some well-known digital cameras that will not be named... ;-)
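A minimal NumPy sketch of that matrix fit. The patch values below are made-up placeholders (the real inputs would be the simulated filter/CCD responses to the ColorChecker patches and their known sRGB values); only the least-squares machinery is the point:

```python
import numpy as np

# Hypothetical simulated responses: each row is the (L, V, B) filter response
# to one ColorChecker patch, computed from the filter and CCD spectra.
# These numbers are illustrative only.
lvb_responses = np.array([
    [0.85, 0.40, 0.22],   # dark skin
    [1.90, 0.95, 0.70],   # light skin
    [1.10, 0.45, 0.65],   # blue sky
    [0.95, 0.55, 0.30],   # foliage
])

# Target linear RGB values for the same patches (also illustrative).
rgb_targets = np.array([
    [0.45, 0.32, 0.27],
    [0.78, 0.58, 0.51],
    [0.36, 0.48, 0.61],
    [0.35, 0.42, 0.25],
])

# Least-squares fit of a 3x3 matrix M such that lvb_responses @ M ~ rgb_targets.
M, residuals, rank, _ = np.linalg.lstsq(lvb_responses, rgb_targets, rcond=None)

def lvb_to_rgb(lvb_pixels, matrix=M):
    """Apply the fitted color transform to an (N, 3) array of LVB pixels."""
    return np.clip(lvb_pixels @ matrix, 0.0, None)
```

A plain linear fit like this can't be exact (the filter spectra aren't linear combinations of the sRGB primaries), which is exactly the "fit isn't perfect" caveat above.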

Solving the second piece -- using the full-resolution data appropriately -- involved first computing the colors for the half-resolution 3-channel dataset, then "modulating" the brightness of these pixels using the full-resolution "luminance". Far easier than the first piece, but still nearly impossible with existing image transformation tools.
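A sketch of that modulation step, again assuming 2x2-binned color data. The nearest-neighbor upsample and the simple mean-of-channels brightness are my simplifications, not necessarily what the actual pipeline does:

```python
import numpy as np

def modulate_with_luminance(rgb_half, lum_full):
    """Combine half-resolution RGB color with a full-resolution luminance frame.

    rgb_half: (H//2, W//2, 3) color image derived from the V/B channels.
    lum_full: (H, W) full-resolution "Luminance" frame.
    """
    # Nearest-neighbor upsample: replicate each color pixel into a 2x2 block.
    rgb_full = np.repeat(np.repeat(rgb_half, 2, axis=0), 2, axis=1)

    # Per-pixel brightness of the upsampled color (a simple channel mean;
    # a properly weighted luma would be a refinement).
    brightness = rgb_full.mean(axis=2)

    # Modulate: rescale each pixel so its brightness tracks the full-res
    # luminance, preserving the half-res color ratios.
    eps = 1e-8  # guards against division by zero in empty sky pixels
    scale = lum_full / (brightness + eps)
    return rgb_full * scale[..., None]
```

The result keeps chrominance at half resolution but carries all the full-resolution spatial detail of the luminance frame, which is the same perceptual trick LRGB imaging (and chroma-subsampled video) relies on.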

The real MVP of this whole adventure is NumPy (and SciPy, for two necessary optimization sub-problems). Also, I believe this may actually have value in the broader research AND astrophotographic communities: if anyone knows anyone with large datasets of CCD data with research filters, I'd love to talk with them!

Anyway, while I initially bit off way more than I could chew, I now give you Andromeda: 42 minutes of light from 150mm effective clear aperture, in what I believe is the first real attempt at LVB imaging, much less with my optimized full resolution capture plan.

Adventures with Andromeda
3 Photos - View album

Post has shared content
Your Blink Questions Answered

Over the last day, we've collected hundreds of questions and votes from the developer community on Google Moderator that seek answers to your tough questions about Blink, Chromium's rendering engine.

+Paul Irish sat down with Chrome's Web Platform PM +Alex Komoroske and Blink engineering leads Darin Fisher and Eric Seidel to get some answers.

The video is embedded below.

Below are all the top-voted questions, along with timecodes you can click:

1:12 What will be the relationship between the WebKit and Blink codebases going forward?

2:42 When will Blink ship on the Chrome channels Canary/Beta/Stable?

3:25 What does the plan for transitioning the WebKit integrated in Android to Blink look like?

4:59 Can you elaborate on the idea of moving the DOM into JavaScript?

6:40 Can you elaborate on the idea of "removing obscure parts of the DOM and make backwards incompatible changes that benefit performance or remove complexity"?

8:35 How will Blink responsibly deprecate prefixed CSS properties?

9:30 What will prevent the same collaborative development difficulties that have hampered WebKit emerging in Blink, as it gains more contributors and is ported to more platforms?

12:35 Will changes to Blink be contributed back to the WebKit project?

13:34 Google said problems living with the WebKit2 multi-process model were a prime reason to create Blink, but Apple engineers say they asked to integrate Chromium's multi-process into WebKit prior to creating WebKit2, and were refused. What gives?

16:46 Is the plan to shift Android's <webview> implementation over to Blink as well?

17:26 Will Blink be able to support multiple scripting languages? E.g. Dart.

19:34 How will this affect other browsers that have adopted WebKit?

20:44 Does this mean Google will stop contributing to WebKit?

21:31 What open source license will Blink have? Will it continue to support the H.264 video codec?

22:11 Any user-agent string changes?

23:38 When will we be able to test first versions of Blink in Chromium?

24:15 How can developers follow Blink's development?

25:40 What is about?

26:40 How will this impact the Dart language's progress?

27:13 Will this be a direct competitor against Mozilla's new engine?

29:03 When will all existing vendor prefixes in Blink be phased out?

30:20 Will you support -blink-text-decoration: blink? ;)

Post has attachment
My desk is finally complete! :-D

Post has attachment
"One of the requests there was to provide some sort of flow chart on how to do machine learning. As this is clearly impossible, I went to work straight away." Interesting.

Post has attachment
Posted without comment

Post has attachment
The Giants World Series trophy was at the office today. Why?! Who knows!! Here's me looking silly with it. Sadly we were told we couldn't touch it, or else I'd have been holding it over my head. ;-)