Y. Thong Kuah
235 followers
#ruby, #geek, #learning

Y. Thong's posts

Post has shared content
Trade Route, amen
Securing our Digital Trade Routes

We weren’t successful in getting Pacific Fibre away. Sorry. We tried. 
The killer blow was not having significant local funding - it's too risky for fund managers to invest in a greenfields infrastructure project.

While we had progressed a number of term sheets with offshore funding providers, we ran into significant market and political issues around connecting countries. These issues proved too difficult for a small private company to solve.

At least there has been a good discussion arising from the project's fallout. There can't be any doubt about how important the internet is to New Zealanders.


Capacity is not the issue

We were disappointed to hear from Government that there is ‘not a capacity issue’. We believe the fact that Southern Cross has not unlocked its capacity points directly to the problem with international broadband.

Southern Cross is rationally motivated to maximize its returns. That means keeping capacity scarce and pricing accordingly.

If Southern Cross halved the price of International Bandwidth we’d likely use 4 times as much. While their revenue would likely go up or at worst stay the same, they would be forced to unlock more capacity and get closer to exhausting their limited supply. Continuing on that path they would eventually face a forced upgrade and major capital expenditure.
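
To make that arithmetic concrete, here is a rough illustration (the unit price and volume below are invented for the example, not real Southern Cross figures):

// Illustrative numbers only - not actual Southern Cross pricing or volumes.
var pricePerUnit = 100;   // $ per unit of international capacity per month
var unitsSold    = 1000;  // units sold per month

var revenueToday  = pricePerUnit * unitsSold;              // 100,000
var revenueHalved = (pricePerUnit / 2) * (unitsSold * 4);  // 200,000 - double the revenue,
                                                           // but four times the capacity released
console.log(revenueToday, revenueHalved);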

The problem is that a 21% average annual reduction in pricing, or even pegging pricing to Australia, might sound good, but it doesn't mean much when usage could otherwise be quadrupling each year.

The Pacific Fibre business plan was based on users spending the same money but getting vastly more capacity.

Substantially lower pricing allows businesses to do real-time conferencing with offshore teams and to use low-price commodity tools like Google Hangouts. US teenagers, tomorrow's global workers, are already being trained into a broadband-abundant mentality, and we need to be there.
A second cable is required to address this market failure and unlock capacity.

 
Demand side versus supply side

International broadband is especially interesting because usage is supply driven.

It appears that Government thinking is that a demand side model works.  That is, consumers will get fibre to the home and want more and more services which will create more demand for international bandwidth and the market will respond.

In reality, the use of international broadband is supply-side driven.

If you were on dial-up you would not download a movie; once you have broadband, you start downloading movies. Availability creates usage. Doubling the availability of internet supply would see it quickly used up.
If you use Skype and get good broadband, you start using video. If that works, you move to HD video, then to multiparty video. Usage is undoubtedly supply driven.

Providing affordable international broadband will see usage increase substantially and see companies using the technology to improve their businesses and grow exports.


Impact on the UFB

We see International Bandwidth intimately connected to UFB. There are few advantages to New Zealand exporters in having fast access to their next door neighbour, compared to the huge advantages of being able to have rich multiparty interaction with offshore customers and partners.

Put simply, the New Zealand UFB program is building a fast intranet.
It’s been great to see the Government spending money on Internet infrastructure. However, it is a complex interrelated system. Fibre to the Home is an expensive investment and without other parts of the system sorted it could become a white elephant. 

We believe that at a household level there is no compelling reason to connect to UFB.
This seems to be borne out by the low adoption numbers, an early sign of the lack of demand. Households will just use their data caps faster, and with the bulk of content offshore, FTTH is not a materially better experience than a good copper/DSL service.

Already, new retail ISPs are reporting almost no demand as they directly target the first suburbs following Chorus's fibre deployment.


UFB and Content 

Consumers today spend $70 per month on Sky TV. They will not spend an additional $70 on broadband. There is a maximum household spend on information services: perhaps $99 for internet, phone and content.
In New Zealand there are no compelling content services that can be delivered over IP. 

This is a fundamental blocker. Either Sky needs to be regulated or global content sources like iTunes, Netflix, Hulu or Amazon Prime need to be unblocked.

UFB cannot possibly be successful without resolving the content issue to create residential demand.


Making the boat go faster

The reason we invested so much time and personal capital in Pacific Fibre is that we truly believe transforming our international connectivity is one of the real ways to provide a step-change opportunity for New Zealand. Look at our entrepreneurs. They repeatedly show we understand the connected business world very well. The sale of Wildfire to Google for a reported US$400m shows how just a few smart New Zealanders can make a real difference with this vital new infrastructure.

While the Government is committed to FTTH we don’t believe it is necessary.  

We may see some innovation from small teams, and FTTH does allow NZ business owners to communicate more conveniently with customers, partners and global teams across time zones without having to drive to the office in the middle of the night. But it is a huge investment, and motivated entrepreneurs will already have sorted out how to be connected.
For businesses using fibre, local ISPs have told me the only application gaining interest is offsite backups - which do work in our Intranet model. This does not improve exports materially.

The sort of applications that increase exports at scale are online multi-party meetings. This real time content cannot be cached and is directly affected by the availability of International broadband.

American students, who have no concept of international broadband (as they are on the same continent as the data centres), are already used to frictionless multiparty video conferencing with Skype and Google Hangouts. How they will do business is fundamentally different to how we can work in New Zealand.


Industry dynamics

Since closing Pacific Fibre we have become aware that there are opportunities to work together with Southern Cross, Telecom Retail and Chorus to engineer a better, longer-term international business model for New Zealand.

Perhaps there is a way to transition the existing cable system from an equity business to an infrastructure business in such a way that all parties win.

It would be very useful to get all parties in a room to discuss.


A solution

From our experience we are reasonably sure a private, carrier-independent company is unlikely to get a $400m undersea cable project underway. But we believe there are other ways to do it, and this requires joined-up thinking.
To kick off, there is a crucial question: do the people of New Zealand want to own their own cable? The feedback we received as Pacific Fibre is that they do. This should be tested. If you see the opportunity as clearly as we do, there is an elegant way for us to step-change New Zealand connectivity and save UFB.

We believe the Government should urgently consider building a New Zealand cable as a Public-Private Partnership (PPP).

Key criteria should include:

● The Government investment should be minimal and able to be extracted later. We're very conscious of the fiscal constraints, but a small investment can provide the opportunity for massive growth.

● It should be fair on Southern Cross, the incumbent provider.

How it could work in 10 steps:

1. The Government forms a PPP to build a new Cable between New Zealand and Australia. It is a for-profit company operating commercially, but under its charter it strives to make a reasonable return on capital and unlock benefits for all of New Zealand.

2. The Cable connects from Australia through New Zealand to the USA as it will provide service to the Australian market, which is 5 times larger than New Zealand.

3. The Cable has spurs connecting Pacific Island countries. This allows New Zealand to take a leadership role in the Pacific and attracts additional development funds from those countries.

4. The Government invests the first $100m in the venture, earning equity returns. Having the Government in first will make it easier for other investors, such as our sovereign wealth funds, to invest in the project in a mixture of debt and equity. That money could be diverted from the $1.5B already allocated to UFB. We’re sure the industry wouldn’t mind if they get a new cable.

This will allow the TE-Subcom vendor contract to be re-established so they can commence construction as quickly as possible. It is useful for the permitting of the project that TE-Subcom is a US vendor.

We learned that presales before financing are not essential. For physical diversity reasons customers will have to buy capacity on the cable, and the favorable unit economics and lower latency will see a new cable attract a better-than-fair share of future contracts.

5. Crown Fibre Holdings, in order to ensure UFB is successful, puts out a tender to supply international broadband on a per connection basis to complete a full wholesale rate card. This means any connection to UFB will get exceptional international Internet, creating a reason for consumers and business to connect and saving that $1.5B investment.

It is likely Southern Cross as well as the new Cable will jointly win that tender. Southern Cross’s revenue will probably go up, but they will have to release a lot more bandwidth, therefore filling their capacity. They are not disadvantaged.

6. New Retail Service Providers will no longer have to buy international bandwidth from Telecom, their largest competitor, and can now cost effectively provide new services to customers with a full internet rate card allowing innovation in new services to New Zealand businesses and consumers.

7. The new Cable company actively sells into the Australian Market, ensuring it delivers good returns to the Crown and all other investors.

New Zealand has over 2m mobile phones and 1m residential broadband connections.  Even if only 1m of those connect over the new system at $10 per month, that is $120m of annual revenue, which covers a $400m investment. And that is not including Australian revenue. Adding $10 to complete the UFB wholesale rate card with international would be a huge stimulation to the local telecommunications industry.
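
A quick sanity check of that revenue figure, using the same numbers (this is gross revenue, not profit, so treat it as indicative only):

// Figures from the paragraph above.
var connections   = 1000000;                          // 1m connections on the new system
var pricePerMonth = 10;                               // $10 per connection per month
var annualRevenue = connections * pricePerMonth * 12; // $120m per year

var cableCost = 400000000;                            // ~$400m build cost
console.log(annualRevenue, cableCost / annualRevenue); // 120000000, and ~3.3 years of gross revenue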

8. The Crown may then seek to divest their investment once the cable system is operational.

9. In order to ensure homes have a reason to connect to UFB, the Broadcasting Minister is tasked with unblocking alternative entertainment content sources into New Zealand, such as US iTunes and Netflix. TVNZ content, once broadcast for free, might also be loaded into these overseas content sources at a low per-use cost, allowing us to export creative content globally.

10. Once established the venture may even be floated on the public markets to allow New Zealanders to invest directly outside of their Superfund investment.

We’ve shown a new cable cannot be done as a private company. This project is very doable with minimal exposure for the Government, minimal cost to the taxpayer, and with the potential to avoid damage to the property rights of Southern Cross. Telecom, the 50% owner of Southern Cross, remains best positioned to exploit the new bandwidth a new cable would provide.

New Zealanders are smart. We’ll use this capability in ways we haven’t even thought of yet. The world loves us. If we can secure this vital digital trade route we can provide a step change for New Zealand.

I am more than happy to assist on this issue in any way I can. I truly believe putting New Zealand on the global network is a game changer.

Post has attachment

Anyone still using Google+?

Post has shared content
Wonderful, hopeful, new update from +Amit Gupta:


"… After over 100 drives organized by friends, family, and strangers, celebrity call-outs, a bazillion reblogs (7000+!), tweets, and Facebook posts, press, fundraising and international drives organized by tireless friends, and a couple painful false starts, I’ve got a 10/10 matched donor!

You all literally helped save my life. (And the lives of many others.)"

Post has shared content
Goodbye, Google Maps… thanks for all the fish

TL;DR: We at StreetEasy decided to build our own maps using, among other tools, OpenStreetMap, TileMill, MapBox and Leaflet, instead of paying hundreds of thousands of dollars per year to Google. And yes, the money pushed us into doing it, but we're happier with the result because we now control the contents of our maps.

We were all happy...
Our site, StreetEasy (http://streeteasy.com/), has been using Google Maps embedded in our pages for the last 6 years. We're a real estate portal, so most of our pages have maps in them. So when Google announced their new usage limits (see http://www.dailymail.co.uk/sciencetech/article-2056128/Google-Maps-start-charging--thousands-sites-apps-hit-fees.html), we were a little worried.

25,000 free map views per day, and $4 per 1,000 views (CPM) beyond that. On Christmas Day, when everybody was opening their presents, we did ten times that. On a good day, we do 600K-700K pageviews (http://www.quantcast.com/streeteasy.com).

We did the math and came up with numbers that reminded me of Oracle licensing in 1999. Six, seven, eight hundred thousand dollars. We met with Google salespeople, expecting to negotiate better terms, and they were nice, and they offered us discounts, but only to about half of what we've calculated.
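
For the curious, here's roughly how that back-of-the-envelope math plays out (this assumes one billable map load per pageview, which isn't exact, so take it as indicative only):

// Back-of-the-envelope only; assumes every pageview triggers one billable map load.
var freeViewsPerDay = 25000;    // free tier
var ratePerThousand = 4;        // $4 per 1,000 views beyond the free tier
var pageviewsPerDay = 600000;   // the lower end of the range quoted above

var billablePerDay = pageviewsPerDay - freeViewsPerDay;          // 575,000
var costPerDay     = (billablePerDay / 1000) * ratePerThousand;  // $2,300
var costPerYear    = costPerDay * 365;                           // ~$840,000

console.log(costPerDay, costPerYear);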

In our opinion, their price was off by an order of magnitude. It's very, very hard to work out a $2 CPM cost in any site's business model, when most of the time, if you're lucky, you're making $1 CPM off your pages. And yes, StreetEasy does much better than that, and it would not have bankrupted us, but it would have also meant giving away a significant chunk of our profits.

It was not just the money!
$200,000 to $300,000 a year is, at the very least, the same as hiring a very good engineer for a year (and paying all the taxes and benefits and costs and still having a lot of money left). It was enough money to finally push us into doing our own maps.

Because despite Google Maps being such an awesome product, it had its downsides. One is that your site looks just like every other site with maps on the Internet (and I know you can customize their colors now, but that costs even more!). Another is that you have no control over your maps, so when you're trying to point out the location of this wonderful apartment, Google might think it's a good idea to clutter the map with random local businesses (and yes, they've gotten better at it, but often it's just noise). Or they might have bad data, and there's very little you can do about it except report it and wait. (I've always been annoyed at "Classon Pointe" being shown in the middle of Harlem, probably a mistake by some mapping data company decades ago. Again, something that has been corrected, but it highlights the problem.)

I've always wanted to have our own maps, but thought it would be impossible, or at the very least, a huge amount of work. Something not worth considering, given the rest of a long list of things we also wanted to build on StreetEasy. But with a potential invoice for a third of a million dollars hanging over our heads, we had enough "carrot" (or is it "stick"?) to revisit our priorities. At the very least, we should do more research and see what our options were.

Looking beyond GMaps
Our first option was, of course, Bing Maps. I'm sure Microsoft is having a great time helping all the Google Maps refugees, and I have no doubt they would have offered us a very cheap licensing deal, but it still meant using someone else's maps, and it would have left us with license renegotiation risks a year or two down the road. But it was an option.

Then, my coworker +Jordan Anderson, sitting quietly across my desk, pointed out that his "other job", the site he had built with a friend before joining StreetEasy, the fabulous Ride The City (http://ridethecity.com/), did not use Google Maps, but their own tiles, and an open source JS library to display them.

A couple of days later, at a NYC Big Apps hackathon where we were showing off our public APIs, I met +Javier de la Torre (from http://vizzuality.com) and he showed me his awesome product, CartoDB (http://cartodb.com) and gave me a few more pointers. And I saw what +Alastair Coote was doing for his taxi app and got excited with the possibilities.

I spent the next week reading and browsing and searching, discovering the wonderful world of digital cartography, and being amazed at how far the open source tools had advanced in the last few years.

The world of Open Source Cartography
We now had a great tile renderer, Mapnik (http://mapnik.org/), that was at the core of pretty much every mapping tool out there. Great "geo" and "gis" functionality for Postgres, in the form of PostGIS (http://postgis.refractions.net/). A few javascript libraries to present the results inside web browsers, such as Leaflet (http://leaflet.cloudmade.com/), Open Layers (http://openlayers.org/) and Modest Maps (http://modestmaps.com/), and other libraries to abstract your mapping backend behind a common API, such as Wax (http://mapbox.com/wax/) or Mapstraction (http://mapstraction.com/).

But then I discovered the "second generation" of tools, built on top of what I just listed in the previous paragraph, and it blew my mind. Things like CartoDB, TileMill (http://mapbox.com/tilemill/) or Web Map Studio (http://cloudmade.com/products/web-maps-studio).

TileMill, in particular, was just amazing, and Carto CSS (http://developmentseed.org/blog/2011/feb/09/introducing-carto-css-map-styling-language/) made map design look like something I could actually do!

And of course, OpenStreetMap (http://www.openstreetmap.org/), the Wikipedia of mapping. An open source (well, technically, Creative Commons) data set, covering the entire globe, with lots of details (sometimes too much detail, like the voltage and gauge of a subway line!). It has a few errors here and there, but you can go and fix them yourself (as I've done http://www.openstreetmap.org/user/sdelmont/edits).

The path we took
I settled on Leaflet for the front end, mostly because it was small, fast, clean code with a good API that resembled Google Maps v2. It's a good thing that when we first implemented maps on StreetEasy, we did it through Ruby that generated the JS code, so all I had to do was "implement a new backend". If I were to do it today, I might use Wax or Mapstraction instead, to ensure I could change map APIs if I had to.

It was fairly easy to implement most basic features. Showing a map, adding markers, adding polygons, info popups (we had our own code for that, just had to hook it on the right events). I spent a couple of days getting our "polygon editor" to work (something I plan to contribute back to Leaflet as soon as I have time to clean up the code). And of course, the dreaded "does it run on IE?" time (I ran into some issues with onload events on script tags, but that was all).
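
For anyone wondering what those basics look like, here is a minimal Leaflet sketch. The tile URL, coordinates and popup text are placeholders for illustration, not our actual setup:

// Minimal Leaflet sketch: map, tile layer, marker with popup, polygon.
// Placeholder tile URL and coordinates - not StreetEasy's real configuration.
var map = new L.Map('map');

map.addLayer(new L.TileLayer('http://tiles.example.com/{z}/{x}/{y}.png', {
  attribution: 'Map data (c) OpenStreetMap contributors',
  maxZoom: 18
}));
map.setView(new L.LatLng(40.7484, -73.9857), 14);   // roughly midtown Manhattan

var marker = new L.Marker(new L.LatLng(40.7484, -73.9857));
marker.bindPopup('A wonderful apartment');
map.addLayer(marker);

map.addLayer(new L.Polygon([
  new L.LatLng(40.750, -73.990),
  new L.LatLng(40.755, -73.980),
  new L.LatLng(40.748, -73.975)
]));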

I installed Postgres and PostGIS, downloaded OSM extracts from http://download.geofabrik.de/osm/north-america/, because there is no point in downloading gigs and gigs of worldwide data when all I care about is the area around NYC. Imported it using osm2pgsql (http://wiki.openstreetmap.org/wiki/Osm2pgsql) and started playing with TileMill.

I discovered the work of Mike at Stamen (for example http://mike.teczno.com/notes/osm-us-terrain-layer.html) and was inspired by it. Found a couple of TileMill projects (https://github.com/mapbox/open-streets-style and https://github.com/mapbox/osm-bright) to better understand how to organize them. Ran into High Road (http://mike.teczno.com/notes/high-road.html), a set of queries that makes OSM roads much more manageable and stylable.

And I spent days and days tweaking maps. Just to get to a point where we were not unhappy with our maps. Something that was good enough to flip the switch.

We added building outlines from NYC Open Data (http://nycopendata.socrata.com/), and our own neighborhood boundaries to decide where to put the labels (and trust me, we have the best boundaries for NYC).

As soon as I had something that didn't cause my coworkers to vomit, I uploaded the tiles to S3 and started testing it on our site. A few days later, and a lot more map tweaks, we started using the new maps for some of our users. And as of Jan 10th, we flipped the switch for all pageviews on our site.

We decided to host our tileset with MapBox (http://mapbox.com), from the great guys at Development Seed. We could have unpacked the mbtiles file produced by TileMill and just uploaded the tiles to S3 (see http://karchner.com/2011/02/21/extract-images-from-an-mbtiles-file-or-getting-actual/), but we went ahead and paid for MapBox, in part because it means fewer servers to worry about, in part because we want to support the guys that brought us TileMill, and in part because of the promise of more cool features down the road. And most importantly, because they promised to help us make our maps look nicer, and they know about nice maps.

Take a look at the results: http://streeteasy.com/nyc/sales/midtown-all-manhattan/status:open%7Cbeds:2?map_all=1

Where to now?
If I haven't made it clear, we're not completely happy with how our maps look, but we were happy enough to go ahead. We want to make them look great, with more data (such as subway stations) and better labels and lots of other little things. Development Seed will help us with that, and we've been learning a lot ourselves.

We'd also like to have a "live mapnik server", producing tiles on demand (and caching the results, duh) to make it easier to tweak our maps. Right now it takes a couple of days to go from OSM import to tile rendering to uploading multi-gigabyte files and finally showing them on the site. A live server would let us change a stylesheet and see the results right away.
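
Something like this, roughly (a sketch only, assuming Node/Express as the web layer and a hypothetical renderTile() helper wrapping Mapnik, e.g. via node-mapnik; the real thing would need cache directory management and more):

// Sketch of an on-demand tile server with a simple disk cache.
// renderTile(z, x, y, callback) is a hypothetical wrapper around Mapnik, not shown here.
var express = require('express');
var fs = require('fs');
var path = require('path');

var app = express();

app.get('/tiles/:z/:x/:y.png', function (req, res) {
  var p = req.params;
  var cachedPath = path.join('/var/cache/tiles', p.z, p.x, p.y + '.png');

  fs.readFile(cachedPath, function (err, png) {
    if (!err) {
      // Cache hit: serve the previously rendered tile.
      res.set('Content-Type', 'image/png');
      return res.send(png);
    }
    // Cache miss: render on demand, cache it, then serve it.
    renderTile(Number(p.z), Number(p.x), Number(p.y), function (renderErr, tile) {
      if (renderErr) return res.status(500).end();
      fs.writeFile(cachedPath, tile, function () {
        res.set('Content-Type', 'image/png');
        res.send(tile);
      });
    });
  });
});

app.listen(8000);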

We will try to contribute back to all these open source projects as much as we can. I already have some code for Leaflet for polygon editing and encoding, for example, and we've started doing edits on OSM.

What about geocoding?
You've probably noticed I didn't talk about geocoding (the "art" of converting a street address into a set of coordinates on a map, in case you didn't know). That's part of what Google offers as part of their Maps APIs.

Well, at StreetEasy we built our own geocoder for NYC, using the City's database of streets and buildings. So it's not something we had to worry about as part of this transition.

But in case you need to do some geocoding, there are plenty of tools (for example http://highearthorbit.com/geocommons-open-sourced-geocoder/) that use OSM data.

The Year of the Open Map
I think that someone at Google got their pricing wrong by an order of magnitude. Large companies might be willing to pay that kind of licenses, but this is not the CMS market in 1998, where people would pay half a million for a Vignette license and another million for Oracle. There are so many open source options out there that the value of proprietary solutions has come down dramatically.

And if Google keeps pushing companies into experimenting with these open source solutions, it's only going to get better. I think 2012 is going to be the year of the Open Map. And I'm happy to be part of the front lines.

Post has shared content
Disqus data shows pseudonymous comments are best

Via the AP's +Jonathan Stray, who noted on Twitter that "Disqus shows that psuedonymous comments are "higher quality," as measured by likes and replies. Major data point."

I agree. Thanks to +Mariam Cook for highlighting the data on her blog.

And I imagine it's one that will be raised to +Vic Gundotra, although last I heard (at Web 2.0 Summit) pseudonyms were supposed to be coming to Google+ soon.

Post has shared content
A Status Update on SOPA from Washington

A colleague just asked me for a crash course on the Stop Online Piracy Act. I sent them my feature:
http://radar.oreilly.com/2011/11/sopa-protectip.html

The thing is, that post is about 6,000 words long and is now a month out of date. So here's the briefing I sent back. First, the major players in the House:

Rep. Lamar Smith, chairman of the House Judiciary Committee. His staffers had a major hand in drafting it. He supports it. So do Reps. Goodlatte and Berman. Rep. Mel Watt is the congressman whose remarks about not understanding the technology helped fuel headlines about people in the House making laws about something (the Internet) they don't understand (which makes people who use and do understand it VERY frustrated).

Dear Congress, It's No Longer OK To Not Know How The Internet Works:
http://motherboard.vice.com/2011/12/16/dear-congress-it-s-no-longer-ok-to-not-know-how-the-internet-works

"Dear Internet: It's No Longer OK to Not Know How Congress Works" - +Clay Johnson:
http://www.informationdiet.com/blog/read/dear-internet-its-no-longer-ok-to-not-know-how-congress-works-

FOR SOPA: RIAA, MPAA, big Hollywood, labor. Ergo, a bipartisan coalition of 39 co-sponsors in the House. Oh, and all of these companies:
http://gizmodo.com/5870241

AGAINST SOPA: Reps. Darrell Issa, Zoe Lofgren, Jason Chaffetz and Jared Polis, and the Internet industry. These four representatives (2 from CA, 1 from Colorado, 1 from Utah) introduced dozens of amendments to SOPA that would have addressed the most damaging, controversial, vague or problematic aspects of the bill, post-manager's amendment. (There are a lot of those.) By raising them, they created two days' worth of debate during the markup, effectively filibustering SOPA's progress during the waning days of the legislative calendar. They essentially ran out the clock, to apply a football metaphor, on the year at a time when the rest of the House was focused on other issues. See: payroll tax cut extension.

Rep. Michele Bachmann is the only GOP candidate I've heard talk about it, which is notable. I think there should have been a debate question about it and the Internet -- but those aren't up to me.

Key counterproposal: an "*OPEN*" bill from Issa and other opponents of SOPA. Learn more at http://keepthewebopen.com … there's a lot that's interesting about that site, including the text of both bills posted with commenting. It hosted an embedded livestream of the markup hearings.

Prospects: mixed. On the one hand, it's looking likely that it will pass out of committee. Proposed amendments were voted down 2-1 in HJC when the manager's amendment was marked up. Unless something changes, I expect SOPA to emerge largely unamended, particularly with respect to the provisions relating to search engines and the use of DNS for enforcement, the most controversial aspects of the bill for the tech community.

On the other hand, there have been significant cybersecurity concerns raised about the bills because of what they would do to DNSSEC, including by DHS officials. The committee might take a classified briefing so that the government's own geeks from Sandia Labs, DHS and other "Three Letter Agencies" could explain to the legislators (who somehow neglected to bring in any technical experts to testify before the committee) why SOPA won't work and why it's a terrible idea to try to use DNS for enforcement. If that happens before markup, it could change the bill that heads to the House floor -- and House leadership might want to address security concerns before bringing it to a full vote.

There's going to be a month when the senators will be hearing about how unpopular these bills are. It's unclear if public opinion will turn enough against them if the broadcast and cable TV networks (which are all FOR SOPA) don't cover it. Fox News just did a spot, so that may be changing. I'd love to be a fly on the wall between network executives and producers at CNN, 60 Minutes and MSNBC right now.

That said, it's not 1982. The Internet will drive awareness of these bills in 2012 in a way that simply wasn't possible before this moment in history. The reaction from tech companies and their leaders is in and of itself news, and it's much harder to miss the discussion around SOPA online now. Google, Facebook and Wikipedia still haven't changed their homepages to protest SOPA. While +Sergey Brin, +Eric Mill and +Jimmy Wales have expressed concerns about the bill as written, +Mark Zuckerberg has not yet written a "status update" about it like Brin's. Those are 3 of the top 10 sites in the world and places that nearly 100% of online citizens hit daily. If Zuck or more Internet executives came out that publicly against SOPA, it would affect the debate in D.C.

People to follow to stay up to date on #SOPA: the single most prolific blogger has been Mike Masnick at +Techdirt, who has shifted much of his output to the issue over the past month. Masnick is ardently against the bill. I think +Declan McCullagh at +CNET and +Gautham Nagesh at The Hill have produced some of the best-sourced coverage around right now and understand both the politics and the technology (a regrettably rare combination). If you want to keep up to date and can afford to pay to get the news earlier, Politico's tech policy team is all over it at @politicopro (paid) and @morningtech. http://politico.com/morningtech.

If you like your analysis free and in real-time, follow
+Julian Sanchez (now at Cato), who has been following SOPA closely, +Nate Anderson at +Ars Technica and Cory @Doctorow at @BoingBoing. The @EFF and Center for Democracy and Technology have all been watching the progress and provisions of the bills on a daily basis, including livetweeting the hearings (@EFFLive).

Date of next markup: Unclear. Likely when the House comes back in session in 2012:
http://majorityleader.gov/calendar/112Congress2ndSession.pdf
Expect @DarrellIssa to share it on Twitter. He's been breaking a lot of the news on SOPA there.

Other key date: January 24th. That's when the Protect IP Act (PIPA) is set to go before the Senate. Senator Reid has said he's going to bring it up on the first day the Senate is back in session. Senator Ron Wyden (D-OR), who put a block on it, says he will filibuster it. The key ratio, as with any bill there, is for/against in the Senate. It will be interesting to see how other senators line up. That 60+ for or 40+ against split is what to ask political analysts about -- I don't know that count yet.
That's what I've got from Washington for now. When I know more, I'll share it onwards.

Post has attachment
Buzz is going away, so I tried the Google Takeout thing to get a backup of all my Buzz posts. Not much worth keeping. Here's one:

http://www.slideshare.net/kuahyeow/rails-3-cool-new-things

Post has attachment
Google News Re-design

I complained about the new Google News design and layout on Twitter: https://twitter.com/#!/kuahyeow/status/127553347732516864. Subsequently I found that others share my reaction too: http://www.readwriteweb.com/archives/google_news_followup.php

Here are some specific problems, in my opinion, which I hope the Google News team will fix.

Firstly, general impressions of the homepage. It is very cluttered, an utter linkfest. The three-column layout with sidebars on the left and the right does not help either. The presence of at least 6 button-y icons near the top of the page presents far too much choice to the reader.

Spacing between news items and lines of text is a problem. The line spacing can be doubled to improve readability.

Clicking anywhere within an individual news section will expand the section, exposing related articles and videos. One major problem here is that clicking on the headline link will expand the section and open the link to the news page as well. This is very bad behaviour.

The left sidebar is an example of bad content. It is presented without any context, and appears to be a bunch of unrelated text. Contrast this with the much better implementation of Twitter trends.

There is a lot of unnecessary blue-box highlighting, plus grey lines, which further add to the impression of clutter. The blue highlighting seems to serve no purpose other than to annoy, as the cursor remains a text cursor - not a pointer - and clicking within the blue box seems to do nothing.

Currently, the only way for me to revert to the much better previous version is to click the Cog icon, switch to the two-column layout, and hide the right sidebar.


#design #ui #readability

Post has attachment
Here's a behind the scenes write-up for my Mix and Mash NZ mashup.

Check it out here :

http://www.landandwaste.co.nz

This mashup attempts to show land and waste data as it changes, and hopefully illuminates any improvements of our environmental impact as well as regressions.

The journey

It started out as an idea to gather kerbside recycling data from all 75-odd local councils in New Zealand. That meant scraping the data off the websites of various local and unitary councils. I quickly backed away once I found that lots of councils forbade any re-use, and that the data that is available is too inconsistent for me to compare and visualize.

Therefore much time and effort was spent looking for suitable data. I was interested mainly in environmental-type data, which was to be found mostly on the Ministry for the Environment website. I was also interested in regional-level data, instead of a single national figure, which wasn't that interesting.

In the end, I found suitable data for land use, rates, farm animal counts, population, and waste. Given more time, I would have hunted for transport data, which would then completely encompass all the major responsibilities of a regional council.

Putting it together

This mashup was made with Ruby on Rails, TileMill (http://mapbox.com/tilemill/), d3 (http://mbostock.github.com/d3/) and polymaps (http://polymaps.org/)

The excellent Koordinates.com put up a CC-licensed shapefile of various land types in New Zealand. Instead of reaching for the de facto choice of Google Maps, this time I chose to try my hand at creating custom map tiles from that data. This had the advantage of letting me design the map to be far friendlier for statistical visualizations like cartograms, etc.

TileMill is simply a brilliant piece of software, which allows one to pull data from various sources and create beautiful maps, using any style you want. In addition, there are options for mouseover and click events.

There is an export function which will allow you to host the map tiles online a la Google Maps. I chose to utilise the hosted service at TileStream (http://mapbox.com/#/tilestream). It is completely possible to self-host as well, as the hosting code has been made available at https://github.com/mapbox/tilestream.

The final map I produced using TileMill can be seen at http://www.landandwaste.co.nz/land.

Importing and processing the tabular data seen on the mashup was just a matter of using a combination of Google Docs, Google Refine, and csv files. Choosing a suitable graphical representation for each set of data was slightly tougher. I used a combination of sparklines, quantile maps, and just plain text. This was done using d3 and polymaps.
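
To give a flavour of the d3 side, here is a minimal sparkline sketch (the data and the element id are made up for illustration; the actual code on the site is a bit more involved):

// Illustrative sparkline using the d3 v2-era API (d3.scale, d3.svg.line).
var data = [12, 18, 15, 22, 30, 28, 35];   // e.g. a regional figure over seven years

var width = 100, height = 20;

var x = d3.scale.linear().domain([0, data.length - 1]).range([0, width]);
var y = d3.scale.linear().domain([0, d3.max(data)]).range([height, 0]);

var line = d3.svg.line()
    .x(function (d, i) { return x(i); })
    .y(function (d) { return y(d); });

d3.select('#sparkline')          // placeholder container element
  .append('svg:svg')
    .attr('width', width)
    .attr('height', height)
  .append('svg:path')
    .attr('d', line(data))
    .attr('fill', 'none')
    .attr('stroke', '#555');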

Where to from here

It was a huge learning experience. I learnt that creating something useful is very possible, and takes less time than I expected. Creating something polished requires far more time. I will definitely find a designer or improve my own design skills next time.

#mixandmashnz