Brendan Kenny
5,490 followers
About
Brendan's posts

Post has attachment
Came across some friends in the woods. Things escalated quickly. #jsconf #visitflorida!
Album: snake fight (10 photos)

Post has shared content
Crossfilter is a JS library designed to help you build multiple coordinated views of a single dataset, letting you see and interact with different aspects of that data simultaneously. There's no reason to limit the views we create to simple graphs, though.

In this screencast I give a (hopefully!) simple introduction to binding a Google Map to Crossfilter so that interaction with the map is mirrored in histograms bound to the same Crossfilter, and manipulations of the histograms affect the data displayed on the map.

There's a live demo and source code here: http://brendankenny.github.io/crossfilter-and-v3/
In our latest Google Maps Garage episode, +Brendan Kenny shows how to explore multi-variate geospatial data using Crossfilter and the Google Maps JavaScript API.

Crossfilter:
http://square.github.io/crossfilter/

The sample code from the show:
https://github.com/brendankenny/crossfilter-and-v3
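
The core idea of coordinated views can be sketched without the library itself. This is a minimal plain-JS illustration, not the Crossfilter API: each "dimension" holds a filter predicate, and a record is visible only if it passes every dimension's filter, so constraining one view (say, the map bounds) automatically narrows what the other views (the histograms) display. All record fields and filter names here are made up for the example.

```javascript
// Hypothetical dataset: events with a time of day and a location.
const records = [
  { hour: 9,  lat: 37.78, lng: -122.41 },
  { hour: 14, lat: 37.33, lng: -121.89 },
  { hour: 22, lat: 37.77, lng: -122.42 },
];

// One predicate per "dimension"; start with everything visible.
const filters = { hour: () => true, position: () => true };

// A record is shown only if it passes every active filter,
// which is what keeps the map and the histograms in sync.
function visible() {
  return records.filter(r => Object.values(filters).every(f => f(r)));
}

// Simulate brushing a histogram: restrict to morning hours.
filters.hour = r => r.hour >= 6 && r.hour < 12;
// Simulate panning/zooming the map: restrict to a bounding box.
filters.position = r => r.lat > 37.7 && r.lng < -122.0;

console.log(visible().length); // → 1 (only the 9am record passes both)
```

Crossfilter does the same thing far more efficiently (sorted indexes, incremental updates, group reductions), which is what makes it practical at hundreds of thousands of records.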

Post has shared content
A sneak peek at this GDL's subject went up on the Google Geo Developers Blog today: the Map Dive demo.

I tend not to share things lightly, but I'm dying to play this game. Learn about how it was built (and how you can build your own crazy Maps skydiving game using the web tech already in your possession) by tuning in tomorrow.
Join the Instrument and Google Maps Developer Relations teams to get an early demo of "Map Diving" at Google I/O. Instrument's developers will walk through how they built the installation using multiple instances of Chrome mashed up with the Google Maps JavaScript API v3, WebGL, 3D CSS, WebSockets, and Node.js.

Post has attachment
Interesting post from +Joseph Bonneau. I hadn't thought of it in this light before, but this is a great way to look at authentication. It sums up well what big providers are already doing using signals like IP address and usage patterns to determine suspicious log-ins.

http://www.lightbluetouchpaper.org/2012/12/14/authentication-is-machine-learning/

I don't entirely agree with one of the post's conclusions, however. While reliable authentication may become increasingly difficult for all but the biggest providers, super-reliable authentication isn't universally necessary. A simple password will suffice for my account on some small shopping site, as long as the financial transaction itself is protected by one of the big guns in authentication; the little data that site stores will presumably never be a valuable enough target for someone willing to go to the trouble of breaking into accounts.

Maybe I'm wrong (and stronger protections certainly aren't a bad thing on their own), but even juicy info like a random person's email address plus demographic data isn't worth much to today's evildoers, and it's probably far cheaper to buy it at 5 cents a pop than to go to the effort of nabbing someone's password or authentication cookie.

Super-smart authentication that catches 95% of fraudulent activity also isn't going to save you if you use 'password1234' for every single account you have online, so stop doing that :)

The post is also worth reading just for the link Joseph provides to this discussion of the burgeoning field of adversarial machine learning: http://blaine-nelson.com/research/pubs/Huang-Joseph-AISec-2011

Post has attachment
Step 1: Update your Firefox Nightly (make sure it's from today).
Step 2: Go visit +Eric Bidelman's transferable object demo:
http://html5-demos.appspot.com/static/workers/transferables/index.html
Step 3: Notice the amazing speedup compared to your stable Firefox.

Computation with web workers has been great, but when you have to return results that are themselves very large (think giant TypedArrays for everything from raw texture data to transforms for hundreds or thousands of objects), you have to pay with the time it takes to copy the data back to the main thread, and you may pay yet again when it's time for GC. If you want to do this every frame of a game or animation, you have had to stick to transferring fairly small objects to maintain a decent framerate.

Transferable objects change all that. On my machine, the demo takes ~150ms to transfer a 32MB buffer without transferable object support, but less than 5ms with them (and hopefully that number keeps going down).

It's very exciting to see support added to Firefox. This means we essentially have double the number of browsers where we can use them (well, when it gets to stable), and transferables become something we can rely on for the vast majority of browsers that can run WebGL content.

Post has attachment
"'It is not common in the life of the law in America for a lower court and a major segment of its bar to take on the nation's highest court, effectively reversing some major precedents or at least substantially mitigating their impact,' notes Steven Flanders in a recent history of the patent court. 'Yet this was done.'

The Federal Circuit, he said, also took on 'the quieter and subtler effort to re-educate trial judges throughout the judiciary, to make them friendlier to patent-holders (or at least to the system of patents) as well.' (Flanders, it should be noted, is an avowed supporter of the Federal Circuit and its efforts to reshape patent law)."

There's a great earlier article on the invention of software patents by the Federal Circuit in the '90s here: http://arstechnica.com/tech-policy/2009/01/resurrecting-the-supreme-courts-software-patent-ban-not-ready/

Post has shared content
And hopefully we can add /* spec: es3 */ before too long.

eh, +Kenneth Russell? :D
Just added support for CSS Shaders in https://github.com/WebGLTools/GL-Shader-Validator - just put /* spec: css */ in your shader!

You can also put:
/* spec: webgl */ or
/* spec: es2 */

to validate against those specs as well.

And you can also set the default spec in Preferences > Package Settings > GL Shader Validator > Settings - Default

Post has attachment
The WebGL shader validator plugin for Sublime Text 2 that +Paul Lewis and I built is now in Package Control! If you have Package Control installed, just "Install Package" and look for "GL Shader Validator". Otherwise, you can still download the repo and install it as you would any other package. When you save a file, any errors will be highlighted (see below for current limitations).

If you have any problems, please file an issue on the github project so we can fix it.

How it works: The heavy lifting is done by the preprocessor and verifier from the ANGLE project, which means you get the exact same error checking you get in Chrome and Firefox, but without requiring a page refresh just to see if you made a syntax error or spelling mistake (and refreshes can be super annoying when they also mean reloading a bunch of WebGL assets).

The great thing about using ANGLE for validation is that it will make sure your shader is fully conformant with the WebGL spec* (which has requirements beyond just valid GLSL) and there are several groups -- including two major browser vendors -- that are very motivated to fix bugs and ensure it remains conformant.

Current limitations: file names have to end with a '.vert' or '.frag' file extension, and you have to have some form of GLSL syntax highlighting enabled for the file within Sublime (the "OpenGL Shading Language" package in Package Control works well). We recognize that many people don't organize their shader files like this: some use inline strings, some concatenate their vert and frag shaders in a single file (with shared common code). We want this to be a useful plugin, so we're eager to expand support for other shader format preferences. Please file an issue on the project with specifics about your approach (patches are even more welcome :) and we'll see what we can do.

[*] ANGLE actually supports several different GLSL-variant specs, including regular GLSL ES and the proposed variant for CSS Shaders. See the package's User settings for how to select other specs to validate against.

Post has attachment
Fun night, but I guess I won't be going to sleep yet. Just remembered this came out today. 8 long years!

So far I've only heard it looks pretty good "for a Source game." No word yet on how the gameplay has fared.

If you have no idea what I'm talking about: http://en.wikipedia.org/wiki/Black_Mesa_(video_game)

We've been waiting for a really, really long time. Right man in the wrong place, etc.

Post has attachment
IonMonkey is now the JavaScript engine in Firefox 18 (currently the nightly version of Firefox). Benchmark party tonight! I'm really interested to see what speeds up.

Also, sweet new JavaScript Engine blog from Mozilla.