You are viewing a single comment's thread from:

RE: How Much of the Rewards Pool is Paid out by BitBots Votes V's Organic Votes

in #utopian-io · 7 years ago

I've also given some serious consideration to whether we ought to be trying to come up with a way to filter out "curation trains" of followed auto-voting, but I haven't come up with a really good way of detecting that sort of event. My gut says that some sort of clustered time-series analysis would be about the only way to detect a relationship between events like that, but it's hard to say for sure.
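To make that concrete, the rough shape I have in mind looks something like the sketch below in R. It is purely illustrative: it assumes the vote stream has already been flattened into a `votes` data frame with `voter`, `permlink`, and `time` columns, which is not something the chain hands you directly.

```r
# Sketch: flag "follow voting" by checking how tightly other accounts' vote
# times track one lead account's vote times on the same posts.
# Assumes a data frame `votes` with columns: voter, permlink, time (POSIXct).

vote_lags <- function(votes, lead_voter) {
  lead  <- votes[votes$voter == lead_voter, c("permlink", "time")]
  names(lead)[2] <- "lead_time"
  other <- votes[votes$voter != lead_voter, ]
  both  <- merge(other, lead, by = "permlink")   # only posts both have voted on
  both$lag <- as.numeric(difftime(both$time, both$lead_time, units = "secs"))
  both[both$lag >= 0, ]                          # keep votes cast after the lead
}

suspect_followers <- function(votes, lead_voter, max_lag = 300, min_posts = 20) {
  lags  <- vote_lags(votes, lead_voter)
  stats <- aggregate(lag ~ voter, data = lags,
                     FUN = function(x) c(n = length(x), med = median(x)))
  stats <- do.call(data.frame, stats)            # flatten the matrix column
  # Accounts landing within max_lag seconds of the lead on many shared posts
  stats[stats$lag.n >= min_posts & stats$lag.med <= max_lag, ]
}
```

An account that consistently lands within a few minutes of the lead account across dozens of posts is the clustered behaviour I mean; a human fan is going to be a lot noisier than that.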

The system does not make it very easy to analyze votes by value and aggregate, so this is always a lot of fun.


That would be an interesting task: Creating a filter that corrects the upvote bot bias. If you are interested in developing such a filter, you should contact @greer184 who is running @q-filter, where he works on alternative ways of filtering content.

I've actually been trying to implement a filter that simply biases presentation in favor of things I have personally voted up, as an additional way of weighting and presenting the content available on the blockchain, but that's been going slowly at best. I know that any kind of filtering is a big task.

In this case, because upvote bots are nearly impossible to detect if they're not on the big listings, I'm not sure it's all that useful to try to correct for them specifically, as opposed to individualizing the experience of a user who is already seeking out content they like and signaling that to the system. After all, it's theoretically possible that someone might like the kind of content that is consistently voted up by a bot. I hate to make that kind of assumption up front. It's theoretically possible that, at some point, someone might create a bot which consistently votes up content that I'm interested in. Theoretically.
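If it helps to picture it, the weighting I mean is not much more complicated than the sketch below. The `posts` and `my_votes` data frames are assumptions made for the sake of the example; nothing in the interface gives you data in that shape directly.

```r
# Toy sketch: push posts by authors I already vote for up my own feed.
# Assumes `posts` (author, permlink, payout) and `my_votes` (author) data frames.

rerank <- function(posts, my_votes, boost = 2) {
  favoured <- table(my_votes$author)           # how often I've voted each author
  w <- as.numeric(favoured[posts$author])      # look up each post's author
  w[is.na(w)] <- 0                             # authors I've never voted: no boost
  posts$score <- posts$payout * (1 + boost * w / max(1, max(w)))
  posts[order(-posts$score), ]                 # my feed, my bias
}
```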

(You really have a bunch of friends who like to follow you around and flag everything you do, don't you? I don't think I've ever seen a comment with reasonable content like that get so absolutely stepped on as hard as feet could go. It's really quite impressive. No one cares that I'm a thorn in their side so much that they follow me around quite so slavishly. Good job!)

I have been trying a type of market basket analysis in R, but I've been getting it wrong so far. I'm trying something like: if A votes for Y all the time, who else also votes for Y all the time? I'm very new to R, so this will take me months to master, lol.
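For what it's worth, one way to frame that question is to treat each post as a basket of voters and mine association rules over those baskets. The sketch below is only a starting point and assumes a `votes` data frame with `voter` and `permlink` columns; the support and confidence thresholds are placeholders to tune.

```r
# Sketch of "A votes for Y all the time -- who else does?", treating each post
# as a basket of the accounts that voted on it (market basket style).
# Assumes a data frame `votes` with columns: voter, permlink.

library(arules)

baskets <- split(votes$voter, votes$permlink)        # one basket of voters per post
trans   <- as(lapply(baskets, unique), "transactions")

# Rules of the form {voterA} => {voterB}: wherever A votes, B usually votes too.
rules <- apriori(trans,
                 parameter = list(supp = 0.01, conf = 0.8,
                                  minlen = 2, maxlen = 2))
inspect(head(sort(rules, by = "confidence"), 20))
```

Pairs with very high confidence over a reasonable number of posts are the ones worth a closer look, with the caveat about genuine communities (raised in the reply below) kept firmly in mind.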

R Programming for Data Science by Roger D. Peng might be handy :-)

He and Jeff Leek have a dozen or so courses on R, stats and data science on Coursera. I don't know how Coursera works right now (haven't used it in years), but a few years back I did enjoy two courses by them:

  1. Exploratory Data Analysis
  2. R Programming

I'm doing a data science thing with Microsoft and edX... time is my biggest constraint. But thank you for the links, because extra references are so handy to have, especially that book. Nice :-)

I suspect that you are going to really enjoy working on analysis in R once you get your feet wet. Thinking about these problems from a procedural point of view really throws certain aspects into sharp relief. I find that it really tests my assumptions about what I expect to be seeing versus what I am actually seeing, and why I'm seeing those things.

Though you have to be careful with the "Alice votes for Bob all the time, who else votes for Bob all the time?" form of inquiry, because it is perfectly reasonable for human beings to act like that. Especially on Steemit, where providers of anything outside of talk about cryptocurrency in general and steem in particular are rare, it is very easy for real communities of people to end up largely voting for each other if they are interested in the same niche subject.

But that's okay, because you would notice that very quickly once you started pulling those clusters out. This is how we learn.

The trick might be far simpler: you look at the transfers with the memo as a URL, and if an upvote to that URL comes back in return, you have a bot at work. Of course I have no idea how to filter that reliably or squeeze it into R or anything else. ;-)

The real problem with doing it that way is actually sorting through that much data, because you need both the full set of transfers and memos and the full set of votes before you can possibly get a positive hit.

If these bot designers were smart, they would start requiring that the memo be sent with an encrypted hashtag at the beginning so that casual observation couldn't make out what the targeted URL is from outside the recipient. Some of them may be doing that; that's outside of my personal experience.

That is a lot of data to be slinging around the network, which is the problem I've been running into a lot lately. It might be possible, but it's definitely not a simple trick.

You caught me there, it's probably too simplistic what I would do:

  • you need the list with the transfer-amount+URL+timestamp1 to the bot
  • plus the list with the bot upvote-percentage+URL+timestamp2
  • then you create a table with the columns for URL, timestamp1, transfer-amount, upvote-percentage
  • then you fill the table with the first list
  • and after that you update the table by adding the 2nd list where the URL is the same and timestamp2 > timestamp1, because the transfer comes before the upvote
  • finally you delete all rows that have empty cells.

Done. But again: This is the approach of a lousy SQL amateur;-)
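For reference, the table-building part of that recipe really is only a few lines once the two lists exist. Here is a rough sketch in R, assuming `transfers` and `bot_votes` data frames that someone has already extracted (which, as the reply below points out, is the actually hard part):

```r
# Sketch of the join described in the list above.
# Assumes two data frames extracted beforehand:
#   transfers: url, timestamp1, amount    (memo already parsed into a post URL)
#   bot_votes: url, timestamp2, vote_pct  (the bot's upvotes, keyed by post URL)

paid_votes <- merge(transfers, bot_votes, by = "url")    # same URL in both lists
paid_votes <- paid_votes[paid_votes$timestamp2 > paid_votes$timestamp1, ]  # transfer first
paid_votes <- na.omit(paid_votes)                        # drop incomplete rows
paid_votes[, c("url", "timestamp1", "amount", "vote_pct")]
```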

See, the problem is not this process, which is fairly straightforward – it's generating the list of transfer amounts and URLs, along with the list of bot upvote percentages and URLs, in the first place. In order to generate those lists, you have to do a fair amount of ugly digging and parsing.

It's that part that's really the issue. Figuring out what the signs of those things are and extracting them.

And then you have to do it for every single bot, which means that you have a fair number of transactions that are going to have to be hitting the server in order to straighten everything out.

It's a lot of data. And ultimately – I'm not sure that it really tells us anything that we don't already know.

It might actually be more efficient to simply query all of the posts made over the last week, have them give up their active_votes attribute, and do all of the parsing on that. If nothing else, it keeps the query simple.
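If anyone wants to try that route, the raw query itself is short. Below is a rough sketch against a public RPC node; the node URL, the `condenser_api.get_discussions_by_created` method, and the exact shape of `active_votes` are assumptions from memory, so check them against the current API docs before relying on any of it.

```r
# Sketch: pull recent posts from a public Steem RPC node and flatten their
# active_votes into one data frame. Node URL and method name are assumptions.

library(httr)
library(jsonlite)

node <- "https://api.steemit.com"

req <- list(jsonrpc = "2.0", id = 1,
            method  = "condenser_api.get_discussions_by_created",
            params  = list(list(tag = "utopian-io", limit = 20)))

resp  <- POST(node, body = toJSON(req, auto_unbox = TRUE), content_type_json())
posts <- content(resp, as = "parsed")$result

votes <- do.call(rbind, lapply(posts, function(p) {
  if (length(p$active_votes) == 0) return(NULL)   # posts nobody has voted on yet
  data.frame(author   = p$author,
             permlink = p$permlink,
             voter    = sapply(p$active_votes, `[[`, "voter"),
             rshares  = as.numeric(sapply(p$active_votes, `[[`, "rshares")),
             stringsAsFactors = FALSE)
}))
```

Even a single 20-post page makes the data-volume point above obvious; scaling that to a full week of posts is where it stops being a simple trick.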

It's theoretically possible that, at some point, someone might create a bot which consistently votes up content that I'm interested in. Theoretically.

lol... I know what you mean. My personal fear is that this happens to me with spammers, whose spam becomes so good that I simply give them their upvote ;-)

A personalized filter would really be something here. I would already be happy if I could simply follow or mute tags, the way it's possible with users. In combination with a language filter, that would filter out at least 95% of the BS floating around. Scrolling through the rest of the available posts would then almost be possible without having to make any further selection...

If you filter out bots (there are maybe 20 big and relevant ones), I think you'd still get a picture of preferences, but the result would probably be too flat, because smaller users don't get enough votes in the first place.

A filter I would find interesting is one that works by recursive means, like Google or the citation system Google is modeled after, but I'm not sure exactly what that should look like (by upvote, by comment, etc.). Unfortunately, I don't have the means to try ;-(
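That recursive idea is essentially PageRank run over the vote (or comment) graph. A bare-bones power iteration in R would look roughly like the sketch below, assuming someone has already built a square matrix `A` with `A[i, j] = 1` whenever account i voted for account j; whether the edges should be upvotes, comments, or something else is exactly the open design question.

```r
# Bare-bones PageRank sketch over a vote graph.
# Assumes a square 0/1 matrix A with rows/columns named by account,
# where A[i, j] = 1 means account i voted for account j.

pagerank <- function(A, damping = 0.85, iters = 50) {
  n   <- nrow(A)
  out <- rowSums(A)
  out[out == 0] <- 1                  # avoid divide-by-zero for accounts with no votes
  M <- t(A / out)                     # each voter's weight spread over who they voted for
  r <- rep(1 / n, n)                  # start with equal rank everywhere
  for (i in seq_len(iters)) {
    r <- damping * (M %*% r) + (1 - damping) / n
  }
  sort(setNames(as.vector(r), colnames(A)), decreasing = TRUE)
}
```

In this sketch, accounts that cast no votes simply let their rank mass leak away instead of spreading it uniformly; a real implementation would handle those dangling nodes more carefully.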

On the downvotes: that comes from the user @berniesanders / @nextgencrypto. He's a bit of a menace here on Steemit, and I'm trying to take him down; it looks like I have found his weak spot. Here's one of the posts I dedicated to him. Normally he also posts a comment after my comments in which he calls me a Nazi. Looks like he has given that up.

I will follow you now.
