dang
We do filter duplicates, but only when the story has had significant attention in the last year or so. This is in the FAQ: https://news.ycombinator.com/newsfaq.html#reposts.

That leaves two cases where the dupe detector is deliberately left porous [1]. One is if the story hasn't had significant attention yet. We want good stories to get multiple chances.

The other case is when a story has had significant attention, but not in the last year or so. In that case we want to allow reposts because some stories are of perennial interest and a year (or so) is enough time for the hivemind cache to clear. Also, it's good for newer cohorts of users to get exposed to the perennials and classics. That's part of how the culture propagates.
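
Roughly, in Python-flavored pseudocode (the point threshold, the field names, and the exact window length are all illustrative stand-ins, not the real values):

    from datetime import datetime, timedelta

    ATTENTION_POINTS = 20                 # stand-in for "significant attention"
    REPOST_WINDOW = timedelta(days=365)   # "a year or so"

    def is_blocked_dupe(prior_submissions, now=None):
        # Filter the repost only if some earlier submission of the story
        # got significant attention within the last year or so.
        now = now or datetime.utcnow()
        return any(
            sub["points"] >= ATTENTION_POINTS
            and now - sub["submitted_at"] <= REPOST_WINDOW
            for sub in prior_submissions
        )

    # A story that did well two years ago gets through again:
    old_hit = [{"points": 450, "submitted_at": datetime(2021, 1, 5)}]
    print(is_blocked_dupe(old_hit, now=datetime(2023, 1, 5)))  # False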

[1] https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

oblib
I've posted links that got buried, and shortly after, the same link, or the same topic from a different source, made the 1st page.

I posted a link to an article on cannabis preventing Covid infections just a bit ago and it was buried within a few minutes on the "New" page.

I do a search for a URL before I post a link here and only post it if it's not been submitted already, but often it gets buried and someone else will post the same link a bit later and it makes it to the 1st page. But I don't feel like I've been cheated when that happens.

I suppose dang could write a routine to check for that, but it'd probably take a lot of resources and not really improve anything for the end user. I think what gets noticed here is a bit random and may be curated. For example, my link to the cannabis article is interesting to me, but it barely skirts the edges of the main focus on computer tech here.

mindcrime
Because they very explicitly want to allow multiple discussions of the same item, spread over time?

Normally, submitting an exact duplicate during a certain window (I don't know how long that window is) automatically translates submissions after the first into upvotes for the first.[0] After that window, dupes are considered OK because $REASONS, where I presume the reasons include something like "some stories are so timeless and interesting that they deserve to be discussed multiple times", or a variation of the thinking behind XKCD #1053[1], and so on.
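
Here's a rough sketch of how I imagine that window working; the eight-hour figure, the function names, and the data model are all guesses on my part, not HN's actual code:

    from datetime import datetime, timedelta

    DUPE_WINDOW = timedelta(hours=8)  # actual window length unknown

    def submit(url, submitter, recent_stories, now=None):
        # An exact-URL dupe inside the window becomes an upvote on the
        # earlier submission, and the submitter lands on that story.
        now = now or datetime.utcnow()
        for story in recent_stories:
            if story["url"] == url and now - story["submitted_at"] <= DUPE_WINDOW:
                story["points"] += 1
                return story
        # Outside the window (or a new URL), it goes through as a new story.
        story = {"url": url, "by": submitter, "points": 1, "submitted_at": now}
        recent_stories.append(story)
        return story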

See, for example:

https://hn.algolia.com/?dateRange=all&page=1&prefix=true&que...

[0]: Or at least that used to be the case. And I've seen it firsthand, so I can definitely confirm that this sometimes happens. Or used to. But I tried one just now with something that's on the "newest" page, and even though I submitted it with the exact same URL, the submission went through as a separate one. Not sure if something's broken, or if that is intentional behavior; perhaps there's more complexity to the rule about when the de-dupe behavior is applied than I was aware of. Not sure. Weird.

Addendum: tried a bunch more just now to see if maybe there's a pattern where some URLs trigger the de-dupe stuff and others don't, but everything I tried went through. My guess is that either the mods changed this intentionally for reasons that only they are privy to, or the de-duper is broken. But that's just a guess.

[1]: https://xkcd.com/1053/

phillipseamore
A limit on identical URLs for a certain period would be nice: the same URL couldn't be posted for a few hours after the first post. Repeats are fine here, so long as they aren't within a small timespan.

It could also be mitigated if, when submitting, the user received back a list of other submissions with the same URL and was asked to confirm that they still want to submit.
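
Something like this, maybe; the three-hour lockout and all the names here are made up:

    from datetime import timedelta

    LOCKOUT = timedelta(hours=3)  # "a few hours"; exact figure invented

    def check_submission(url, prior, now):
        # Hard-reject inside the lockout; otherwise surface earlier
        # submissions of the same URL and ask the user to confirm.
        same_url = [s for s in prior if s["url"] == url]
        if any(now - s["submitted_at"] < LOCKOUT for s in same_url):
            return ("reject", "identical URL posted within the last few hours")
        if same_url:
            return ("confirm", same_url)  # list shown back to the submitter
        return ("accept", None)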

ZeroGravitas
I think they were until recently. There may be some issue with the search system not showing recent submissions, which is causing the dupe detector not to work as well.
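
I.e., roughly this, purely illustrative:

    def find_dupes(url, search_index):
        # The dupe check only sees what the (possibly stale) index has.
        return [s for s in search_index if s["url"] == url]

    stale_index = []  # indexer hasn't caught up with /newest yet
    print(find_dupes("https://example.com/story", stale_index))  # [] -> dupe gets through
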
theandrewbailey
Don't know why. Seems like a bug to me, and it occurs often:

SBF Missed FTX’s Risks (bloomberg.com) 2 points by latchkey 53 minutes ago | flag | past | 1 comment

SBF Missed FTX’s Risks (bloomberg.com) 1 point by ioblomov 57 minutes ago | flag | past | 1 comment

https://news.ycombinator.com/from?site=bloomberg.com

@dang: help?
