I published an asynchronous streaming JSON parser for node.js (npm install stream-json). How is it different from the rest?
1. It is small. You will not find files with a thousand lines of code. In fact, all files are smaller than 200 lines.
2. It literally streams everything, including individual values, which are frequently assumed to be small enough to fit in memory. Well, after I saw a 1T (that's a terabyte!) string at a client's, I knew I had to do something about it. The upside is a small memory footprint no matter what, which is always good.
The API is inspired by SAX: a simple event-based model is used to communicate with the user's code.
Technically, it is a collection of node.js 0.10 stream components, which can be combined with pipes to produce a workflow suitable for any custom task. Everything is documented, so users can add their own components to the mix.
A score of small components is provided, including a filter, which can selectively pass or suppress sub-objects, effectively producing a subset. The filter can operate using a regular expression or a function.