Datasets:
Filtered Cebuano?
5,647,436 articles for Cebuano is... strange at best. Has anyone looked into removing the LSJBot spam?
A simple filter that should work:
drop any row where "|Lsjbot|" in row['wikitext']
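As a rough sketch, this could be done with the Hugging Face datasets library along the lines below. The repo ID and config name are placeholders (not necessarily the published ones); the 'wikitext' column is taken from the filter above.

```python
# Rough sketch of the proposed filter using the Hugging Face `datasets` library.
# The repo ID and config name are hypothetical placeholders; the 'wikitext'
# column name comes from the suggestion above.
from datasets import load_dataset

ds = load_dataset("HuggingFaceFW/finewiki", name="ceb", split="train")  # hypothetical repo/config
before = len(ds)

# Drop every article whose wikitext contains an Lsjbot template marker.
ds = ds.filter(lambda row: "|Lsjbot|" not in row["wikitext"])

print(f"kept {len(ds):,} of {before:,} rows")
```

A plain substring check like this is crude (it only catches pages that still carry the Lsjbot marker), but it illustrates the kind of pass being suggested.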
...Filtering after the fact does not undo the S3 download cost, the storage footprint, or the other noise that is not Lsjbot.
One could trivially apply the "|Lsjbot|" in row['wikitext'] filter. That isn't the problem. I find the lack of attention to detail in the pursuit of getting datasets out faster concerning. Calling the current release "finewiki" feels like an insult to those who took the time to examine Wikipedia in detail, identify spam and quality patterns, and then prune them (like what I did at featherless).
I additionally have the same concerns about fineweb 1 and fineweb 2. There is a lot of content that I would consider spam which has not been filtered out.
Appreciate the pointers! We spent hundreds of thousands of GPU hours on ablations for fineweb 1 and 2, but I'm more than happy to take a look once you release your drastically improved version.