Colin Alworth
438 followers
Posts

GWT 2.8.2 is here! Check out the release notes at http://www.gwtproject.org/release-notes.html#Release_Notes_2_8_2.

Some notes:
* GWT can now run on Java 9 (though Java 9 features are not yet supported, coming soon!)
* The Chrome 61 change affecting getAbsoluteTop()/getAbsoluteLeft() has been fixed
* Errors reported via window.onerror are now routed to your uncaught exception handler (see the sketch below)
* GWT now generates CSP-compliant DOM elements
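
For the window.onerror bullet, here's a minimal sketch of wiring up an uncaught exception handler in a GWT entry point. The class name is made up for illustration; GWT.setUncaughtExceptionHandler and GWT.UncaughtExceptionHandler are the standard GWT APIs:

import com.google.gwt.core.client.EntryPoint;
import com.google.gwt.core.client.GWT;

public class ExampleEntryPoint implements EntryPoint { // hypothetical entry point class
  @Override
  public void onModuleLoad() {
    // Exceptions escaping GWT event handlers land here; with 2.8.2, errors
    // surfaced via window.onerror are routed to this handler as well.
    GWT.setUncaughtExceptionHandler(new GWT.UncaughtExceptionHandler() {
      @Override
      public void onUncaughtException(Throwable e) {
        GWT.log("Uncaught exception", e); // or report it to your server
      }
    });
  }
}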

Post has shared content
See you there!

GWT 2.8.1 is here!

Please check out the full release notes at http://www.gwtproject.org/release-notes.html#Release_Notes_2_8_1, then download the release zip (https://goo.gl/TysXZl) or update your project to get version 2.8.1 from Maven Central.
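
If your project pulls GWT from Maven Central, the update is typically just a version bump in your pom.xml. A minimal sketch, assuming the usual com.google.gwt:gwt-user dependency (add gwt-dev, gwt-servlet, etc. as your build needs):

<!-- bump the GWT version wherever your build declares it -->
<dependency>
  <groupId>com.google.gwt</groupId>
  <artifactId>gwt-user</artifactId>
  <version>2.8.1</version>
</dependency>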

Thanks to all of our users who have reported issues or concerns, our contributors who keep discussing and improving GWT, and our release team who double checks the work before we release it!

Post has shared content
New GWT logo is here! Check out our updated webpage, GWTproject.org

Post has attachment
No, it doesn't start Wednesday, but you'd better get into town before Thursday, because we're starting up early!

Early-ish. 8:30am, Pacific time. Early for some people.

Okay, not too early.

Shaddup and get going.

Post has attachment
http://engineering.shapesecurity.com/2015/01/detecting-phantomjs-based-visitors.html?m=1

+Ariya Hidayat, is this something you consider a flaw in PhantomJS (it would be preferable to let Phantom act exactly like a Chromium build and no more), or is it a feature (blocking one scraper should block them all)?

More discussion at http://www.reddit.com/r/webdev/comments/2srbka/detecting_phantomjsbased_visitors

Post has attachment
Wrote up a long reply to an article that came up on /r/TrueReddit, and it had been deleted by the author by the time I finished. Dumping it here instead; I'm not sure whether I was expecting the original submitter to argue, or just to have other people hit me over the head. It's a bit long-winded and whiny, and now the internet gets to see it, even if no one reads it.

http://www.theatlantic.com/technology/archive/2015/01/the-cathedral-of-computation/384300/

Hmm. I think 'algorithm' might be confused with 'abstraction', and if the point is that people use them interchangeably, that might be true to a degree, but they really mean different things. First though, nitpicking about the title, and abuse of 'what I want words to mean':

> Science and technology have become so pervasive and distorted, they have turned into a new type of theology.

Theology being a study of religion, belief and god. To go a bit quote-happy myself, Geertz defines religion as "a system of symbols which acts to establish powerful, pervasive and long-lasting moods and motivations in men by formulating conceptions of a general order of existence and clothing these conceptions with such an aura of factuality that the moods and motivations seem uniquely realistic." I don't plan on breaking that down here, but I'll accept that our author is not talking about the same religion that the rest of us are, but is instead suggesting that computers are a sort of god, at least in the eyes of some. But really, science has been perverted into merely a study of computers-as-god?

Okay, I'll give the clickbait title a rest. Moving on.

> Like metaphors, algorithms are simplifications, or distortions. They are caricatures. They take a complex system from the world and abstract it into processes that capture some of that system’s logic and discard others.

Algorithms are not caricatures; they do not model systems, and they do not capture logic. That model is an abstraction, but the algorithm is not the abstraction. The algorithm is what you can do with the abstraction. It should be a surprise to no one that a bad abstraction may never allow you to create a constructive set of steps to solve a problem.

If I'm growing tomatoes in my garden, and I count the tomatoes I get each year, I'm abstracting my total yield into a number by merely counting. If I instead choose to weigh the total yearly output, I reduce it to a different number from which I can draw a different conclusion, one which might make it clearer that I had a few huge tomatoes, not a giant crop of tiny ones. Neither of these ideas of my garden's yield captures the idea of 'well, they were infested with bugs and mostly inedible' or 'I picked and served them before they could ripen on the vine or in storage', or 'everything went well and they were delicious'. No, I used measurements to produce an abstract concept of yield so that I could try to reproduce my good results, or improve them next year. That process is where we begin to get algorithmic.

The Algorithm of 'how long will it take to get to work' is a ton of data collection and a bit of prediction. Netflix is a fun example to look at and say 'wow, our data sucks', but the article seems to insist that the process is broken, or perhaps the movies themselves? That people don't like the things that they say they should like? That people's likes should be simpler? That Netflix is building a Tower of Babel by daring to try to model what we enjoy?

> But the overall work of the Netflix recommendation system is distributed amongst so many different systems, actors, and processes that only a zealot would call the end result an algorithm.

Sounds like we need a definition. From http://en.wikipedia.org/wiki/Algorithm

> In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/ al-gə-ri-dhəm) is a self-contained step-by-step set of operations to be performed.

Are those processes and systems documented? Are the concepts for forming and maintaining exceptions codified? To go a bit MythBusters, did you write down your process and results? I'd be amazed if the answer wasn't "Of course! That's the point of science!"

> First, it allows us to chalk up any kind of computational social change as pre-determined and inevitable. It gives us an excuse not to intervene in the social shifts wrought by big corporations like Google or Facebook or their kindred, to see their outcomes as beyond our influence. Second, it makes us forget that particular computational systems are abstractions, caricatures of the world, one perspective among many. The first error turns computers into gods, the second treats their outputs as scripture.

So where is the god in the movie database? Does Netflix give up when the purely computational algorithm fails them, and redefine success to be 'the best that a computer can give us, since computer is the best'? No, they tried it one way, found it didn't work, opened up to the Machine Learning world, found that didn't work, and expanded their abstractions and processes (I hope you hear 'algorithm' for the second, but not the first) until they did achieve the results that were desired. I'll bet you money that they haven't gotten there yet, but just keep inching closer, or at least less far from their target.

At least I have nothing to disagree with on the second point - abstractions can be caricatures, but to draw on the earlier example given, so is 'time' - the mishmash of 'day' and 'year' is pretty gross (and gets worse once you notice that they are drifting, albeit slowly), but they serve as an incredibly useful abstraction - timekeeping and an understanding of these (abstract) relationships brought us ocean-going navigation, by way of solving for your longitude based on the position of the sun at a given time. And electricity too:

> Until we think a little harder about the historical references Manovich invokes, such as electricity and the engine, and how selectively those specimens characterize a prior era. Yes, they were important, but is it fair to call them paramount and exceptional?

I'm not even sure how to respond to that. No, the discovery/invention of these tools didn't revolutionize anything, but their use and implementation surely did. Lights at night without burning? Long distance communication? Trains? Haven't the last several generations taken these for granted, and wouldn't they be as out of place before their invention as we now would be without a cell phone? Do you know why the US Constitution has voters go to the polls in November, but inauguration waited until March?

Maybe I'm too deep into the sausage-making side of software, but while most users trust the mechanical side of software (delivering email, loading webpages, printing documents), they reserve a grain of salt for suggestions for movies or search results. I'd argue that clicking the 'next' button or even scrolling down the list of search results is an outright rejection of the search engine algorithm. Closing the page in anger might count too, but I'm willing to concede that one. 

Our brains love abstraction and instant classification - that's why we make snap judgements about things we may not understand. This isn't a new feature of our wetware; it's not something that arrived when Google did. No, we have found it to be a quick-and-dirty way to solve problems since we figured out vocal communication - that thing is a thing, but it has a noise, and when I make the noise, I call to mind the thing. The noise is not the thing, but it is an abstraction of the thing (a caricature, if you must). We have all kinds of ways to abstract out the nitty-gritty details of reality, and this is not a bad thing. Computers are the next step in this process, and while they might have made it easier, they didn't cause it to come about.

The algorithm in all its forms should not be vilified because the abstraction is bad. "Google announced a change to its algorithm" - why? Because it was being gamed by those who truly would pervert it - and that's what the linked article is actually about, not Google making a change! Some countries ban or limit giant advertisements on the highway for the same basic reason: they serve as a distraction from the task at hand. Are those legal changes ruining culture as we know it too, or is that just too far removed from 'a series of steps'?

I dunno. I think this could have gone to interesting places if there were more about the "zealous devotion" and less about "algorithm = abstraction = bad". "And like any well-delivered sermon" indeed.

Post has attachment
I agree on almost all points - the nature of the JS beast really does seem to encourage this amazing churn. 

One exception: Someone should probably let Google know that GWT's time has passed, and that it shouldn't be used for Inbox, Flights, Sheets, Groups...