Joel David Hamkins
Mathematics and philosophy of the infinite
Vopěnka showed that every set is in a forcing extension of HOD. So the set-theoretic universe V is the union of all these various set-forcing generic extensions HOD[G]. Can we unify all these forcing notions so as to realize the entire set-theoretic universe V as a class-forcing extension of HOD? That is, must V=HOD[G] for some class forcing notion definable in HOD and with definable forcing relations there?

The answer is no. The main result of this article, joint with Jonas Reitz, is that if ZFC is consistent, then there is a model of ZFC that cannot arise in that way as a class forcing extension of its HOD.
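In symbols, the theorem and the question can be stated as follows (this notation is a paraphrase of the post, not necessarily the paper's own):

```latex
% Vopěnka's theorem: every set is set-generic over HOD, so
V \;=\; \bigcup\,\bigl\{\,\mathrm{HOD}[G] \mid \mathbb{P}\in\mathrm{HOD}
        \text{ a set forcing notion},\ G\subseteq\mathbb{P}
        \text{ generic over }\mathrm{HOD}\,\bigr\}.
% The question: is there a single class forcing notion \mathbb{P},
% definable in HOD with definable forcing relations, and a class G,
% such that V = \mathrm{HOD}[G]?
```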
Burak Kaya asked me the following interesting question: What do you get if you build the constructible universe, but at each stage you add only the sets that are definable without parameters over what you have constructed so far? That is, we iterate the parameter-free definable power set operation.

The answer is that you still get the full constructible universe, although the specific objects will arrive into the hierarchy a bit more slowly.

One way to see this is to observe that the L-least set that is missing at any stage will be definable without parameters, under that description, and so it will get added. If we always add the L-least set that is currently missing, then we will eventually add every set from L.

To explain in a little more detail: let L*_alpha be the new hierarchy, where L*_{alpha+1} consists of the parameter-free definable subsets of L*_alpha, and L* is the union of all these. (In contrast to the usual L hierarchy, where L_{alpha+1} consists of the definable subsets of L_alpha, allowing parameters.) The claim is that L* = L. Clearly, L* is contained in L, and if they are not equal, then there is a least ordinal alpha with L_alpha contained in L*, but L_{alpha+1} not contained in L*. Note that alpha is definable by this property, without parameters, in any sufficiently large L*_beta, where L_alpha is contained in L*_beta. But then the least parameter in L_alpha used to define a missing subset is also definable there, and so the missing set is itself parameter-free definable in L*_beta and gets added after all. So there are no missing sets, and L* = L.
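In symbols, the modified hierarchy described in the post is:

```latex
\begin{align*}
L^*_0 &= \varnothing,\\
L^*_{\alpha+1} &= \{\, X \subseteq L^*_\alpha \mid X \text{ is definable over }
      (L^*_\alpha, \in) \text{ without parameters} \,\},\\
L^*_\lambda &= \textstyle\bigcup_{\alpha<\lambda} L^*_\alpha
      \text{ for limit } \lambda,
\qquad
L^* = \textstyle\bigcup_{\alpha\in\mathrm{Ord}} L^*_\alpha .
\end{align*}
```

The usual hierarchy L_alpha is defined the same way, except that the definitions at successor stages may use parameters from L_alpha.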
There is a new entry in the Stanford Encyclopedia of Philosophy on Alternative Set Theories, written by Randall Holmes, which devotes a section to my views on set-theoretic pluralism and the multiverse, in the context of a survey of many different kinds of alternative set theory, from Zermelo to Ackermann to NF and more.

https://plato.stanford.edu/entries/settheory-alternative/#MultViewSetTheo

Black open access: triumph of the pirates

In gold open access, you write a paper and pay a big company lots of money to give it away for free. For some reason this isn't catching on.

In green open access, you publish your paper with a big company. They charge people to read it – but you make another version available for free. This has not caught on except in math and physics.

In diamond open access, you publish your paper with a journal that's free for you and free for the people who read it. This has also not caught on, because most "prestigious" journals – the ones you need to publish in to get a job – are run by companies who don't do stuff for free.

In black open access, people illegally download millions of papers and books from big companies and make them available to everyone for free. This is working great.

On Sci-Hub you can get at least 62 million papers for free. The publisher Elsevier sued them and won a $15 million judgement against them, but they haven't gotten a penny and they probably never will. Their original IP address was blocked, but they're still easy to find. The lawsuit mainly brought them more publicity! Now they say 200,000 papers get downloaded each day.

On LibGen you can get at least 52 million articles and tons of books. In 2015, a New York district court ordered them to shut down, but... they're still there, and easy to find.

Now the so-called experts on open access are waking up to this fact. Here's the start of an article about this. It makes two basic points. One is that green and gold open access require lots of players to cooperate in changing their behavior, while black open access does not. Another is that bundling journals – selling them to libraries only in large groups – is slowing change. Yes, you can buy an individual paper from a journal, but it usually costs about $30. Why bother, when you can get it from Sci-Hub or LibGen for free?

-----------------------------------------------------------------------------

We've failed: Pirate black open access is trumping green and gold and we must change our approach

Toby Green

Key points:

• Sci-Hub has made nearly all articles freely available using a black open access model, leaving green and gold models in its dust.

• Why, after 20 years of effort, have green and gold open access not achieved more? Do we need ‘tae think again’?

• If human nature is to postpone change for as long as possible, are green and gold open access fundamentally flawed?

• Open and closed publishing models depend on bundle pricing paid by one stakeholder, the others getting a free ride. Is unbundling a fairer model?

• If publishers changed course and unbundled their product, would this open a legal, fairer route to 100% open access and see off the pirates?

At the 2017 UKSG Conference, a show of hands during Stuart Lawson's plenary on ‘Access, ethics and piracy’ showed that barely anyone present blocked access to Sci-Hub or considered it should be blocked. What I find startling is not that a room full of UKSG delegates seem to be condoning piracy and supporting black open access, but that, 15 years on from the Budapest Open Access Declaration, a pirate site is needed at all. After all, the pirates have long since been chased out of the music business.

It is not as if the past two decades have been spent idly. Open access advocates have busily encouraged stakeholders into funding both high-profile ‘alt-publishing’ efforts and lower-profile more-or-less open access journals (e.g. https://doaj.org/) and brought forth a roar of policies and mandates that aim to oblige authors to change their publishing habits (https://roarmap.eprints.org/). All this has been accompanied by a controversy of sessions at conferences like UKSG, which have debated the many facets of open access to the point where conference organizers must be getting desperate to find an original angle. There is even a tool to help you see where a journal sits on an open access spectrum (http://www.oaspectrum.org). Let us also recognise that every single stakeholder in the scholarly communications industry claims to be supportive of open access (yes, including publishers, commercial or not) and has set up associations to help things along (e.g. https://sparcopen.org/ and https://oaspa.org/). Now, some European countries are trying a new approach: to demand of the major publishers nationwide open access contracts, such as Projekt DEAL in Germany.

Yet, while we have been bickering about the true path to open access nirvana, the pirates have crept up on us, especially in the form of Sci-Hub, which is self-reporting more than 60 million articles freely available and could have harvested nearly all scholarly literature – if true, Sci-Hub has single-handedly won the race to make all journal articles open access.

Set against this are the combined efforts of stakeholders in scholarly communications who, after two decades, have managed only to get around half the world's research articles open, with the rest still behind a paywall 3–4 years post-publication. If past performance is any guidance, around four-fifths of all new scholarly articles in 2017 will be unavailable for most people on publication via legal channels. It does not look impressive: black open access has trumped green and gold.

For books, despite initiatives like Open Access Publishing in European Networks (OAPEN), Knowledge Unlatched, and Open Book Publishers, progress has been glacial. At the time of writing, there are just over 8,000 titles listed in the Directory of Open Access Books (http://www.doabooks.org/), which – considering that Springer alone offers nearly 280,000 titles from its online bookshop – suggests that the proportion of books published open access has yet to reach 2%.

Not for the first time, pirates are delivering where the established players and legal channels are not (e.g. https://en.wikipedia.org/wiki/Radio_Luxembourg). To see off the pirates, the music industry recognized that they could not rely only on legal means to shut the pirates down; they also had to revolutionize their business models so that the legal offering was more attractive to users than a free, pirate service. Not only did they find one such model with pay-to-download services, like iTunes, but they are now well on their way to complementing it with another built around streaming.

The evidence above says that green and gold open access models are not the revolutionary business models we need because, if they were, then they would have >80% market share already, the pirates would be looking elsewhere for opportunities, and I would not be writing this piece. True, there are some sustainable open access successes like BioMed Central, PLoS, and arXiv, but their share of all articles, let alone books, remains marginal. To see off the pirates (and to nick a line from O Flower of Scotland), we need tae think again.

So, why have green and gold failed? I do not blame political leaders, the power of monopolies, or unwilling academics. Rather, I think that these models are flawed for two key reasons: change and bundle pricing.

-----------------------------------------------------------------------------

For more, read this - it's open-access:

http://onlinelibrary.wiley.com/doi/10.1002/leap.1116/full
We need more distilleries.
Climbing the mountains of mathematics

Chris Olah and Shan Carter write:

Achieving a research-level understanding of most topics is like climbing a mountain. Aspiring researchers must struggle to understand vast bodies of work that came before them, to learn techniques, and to gain intuition. Upon reaching the top, the new researcher begins doing novel work, throwing new stones onto the top of the mountain and making it a little taller for whoever comes next.

Mathematics is a striking example of this. For centuries, countless minds have climbed the mountain range of mathematics and laid new boulders at the top. Over time, different peaks formed, built on top of particularly beautiful results. Now the peaks of mathematics are so numerous and steep that no person can climb them all. Even with a lifetime of dedicated effort, a mathematician may only enjoy some of their vistas.

People expect the climb to be hard. It reflects the tremendous progress and cumulative effort that’s gone into mathematics. The climb is seen as an intellectual pilgrimage, the labor a rite of passage. But the climb could be massively easier. It’s entirely possible to build paths and staircases into these mountains. The climb isn’t something to be proud of.

The climb isn’t progress: the climb is a mountain of debt.

Shan Carter and Chris Olah are serious about building a new institution to help tackle this problem. They call it research debt: research builds up faster than it gets clarified and explained.

Their project is called Distill. They seem to be focused on machine learning, not mathematics. But the problem affects every area of research.

They explain some causes of research debt:

Poor Exposition: Often, there is no good explanation of important ideas and one has to struggle to understand them. This problem is so pervasive that we take it for granted and don’t appreciate how much better things could be.

Undigested Ideas: Most ideas start off rough and hard to understand. They become radically easier as we polish them, developing the right analogies, language, and ways of thinking.

Bad Abstractions and Notation: Abstractions and notation are the user interface of research, shaping how we think and communicate. Unfortunately, we often get stuck with the first formalisms to develop even when they’re bad. For example, an object with extra electrons is negative, and pi is wrong. [The really important number is 2pi - jb]

Noise: Being a researcher is like standing in the middle of a construction site. Countless papers scream for your attention and there’s no easy way to filter or summarize them. We think noise is the main way experts experience research debt.

The insidious thing about research debt is that it’s normal. Everyone takes it for granted, and doesn’t realize that things could be different. For example, it’s normal to give very mediocre explanations of research, and people perceive that to be the ceiling of explanation quality. On the rare occasions that truly excellent explanations come along, people see them as one-off miracles rather than a sign that we could systematically be doing better.

To tackle research debt, we need more distillers:

Research distillation is the opposite of research debt. It can be incredibly satisfying, combining deep scientific understanding, empathy, and design to do justice to our research and lay bare beautiful insights.

Distillation is also hard. It’s tempting to think of explaining an idea as just putting a layer of polish on it, but good explanations often involve transforming the idea. This kind of refinement of an idea can take just as much effort and deep understanding as the initial discovery.

This leaves us with no easy way out. We can’t solve research debt by having one person write a textbook: their energy is spread too thin to polish every idea from scratch. We can’t outsource distillation to less skilled non-experts: refining and explaining ideas requires creativity and deep understanding, just as much as novel research.

Research distillation doesn’t have to be you, but it does have to be us.

They're trying to build an ecosystem that supports distillers:

Like the theoretician, the experimentalist or the research engineer, the research distiller is an integral role for a healthy research community. Right now, almost no one is filling it.

Why do researchers not work on distillation? One possibility is perverse incentives, like wanting your work to look difficult. Those certainly exist, but we don’t think they’re the main factor. Another possibility is that they don’t enjoy research distillation. Again, we don’t think that’s what’s going on.

Lots of people want to work on research distillation. Unfortunately, it’s very difficult to do so, because we don’t support them.

An aspiring research distiller lacks many things that are easy to take for granted: a career path, places to learn, examples and role models. Underlying this is a deeper issue: their work isn’t seen as a real research contribution. We need to fix this.

To see how they're trying to fix it, go here:

https://distill.pub/2017/research-debt/

I thank him for pointing this out! He ran into this while studying machine learning.

I believe the picture is from Nepal.
The time is coming soon.
Günter Ziegler (Free University of Berlin) on negotiating with Elsevier:

It's like you're at a car dealer trying to buy a car, but the salesperson keeps trying to sell you a carriage... You tell him: "I don't want a carriage, I want a car." And he says: "Well, if you buy this carriage, we'll give you this horse for free."