by Wallenburg » Tue Jun 14, 2022 3:13 pm
by Reploid Productions » Tue Jun 14, 2022 3:59 pm
Wallenburg wrote:I think discussion on the matter of HTML script legality would benefit immensely if it were clear whether admin has already decided to kill the R/D game or is merely entertaining the idea. Is any amount of discussion here liable to actually convince admin against such a drastic course of action over such a nothingburger, or are we discussing ways to conduct damage control once a ban is implemented?
[violet] wrote:Maybe we could power our new search engine from the sexual tension between you two.
by [violet] » Tue Jun 14, 2022 7:16 pm
Flanderlion wrote:Are the QoL improvements for individuals using their own scripts worth the admin time to police it? Like, sometimes something that seems like a good idea (allowing scripts on the HTML site to improve QoL) turns into more work than it's worth. Would it really be the end of the world to have a blanket ban, and enforce it only when something comes to mod/admin attention?
Roavin wrote:Once again, [v], thank you so much for coming into this thread and giving us this information - it might not seem like it this way for some at first glance, but as the primary tech dude within this thread, your post answers a lot of extant questions that we were having, and (for me at least) vindicates having put the effort into all of this.
Roavin wrote:There is an inconsistency in the documentation - Elu's suggested identification via URL parameters does not mention contact info, but it is what Reliant used as a template to identify itself. With your post, we now know that this was incorrect, as it lacks the contact info required by the OSRS script rules. We'll fix that in Reliant immediately, but it might be worth either amending Elu's post or, even better, amending that part of the OSRS script rules to include a "correct" example. (@NS Staff generally: what's the best way to report this, beyond writing it here?)
Roavin wrote:On the bots being banned: You've implied several times that you do this with some frequency, but I haven't witnessed it in the circles I frequent; even with Storm, the unnamed TG tool, and now Reliant, no ban took place, but rather other measures; and I know of at least an order of magnitude more than 3 tools doing a variety of things that have not been blocked or banned. Is it possible that you have a bit of selection bias, since the scripts you deal with are the bad ones, while the good ones just do their thing as they should?
Roavin wrote:[violet] wrote:Incidentally the script spams (or spammed) requests like this at the rate of ~10 reqs/second -- I don't know what it's doing, but it looks to be requesting the exact same data over and over, which is behavior I usually interpret as a broken bot and block via CloudFlare.
- Code:
xxx.xxx.xxx.xxx - - [06/Feb/2022:22:25:39 -0800] "GET /template-overall=none/page=reports/script=reliant_1.3/userclick=1644215140121 HTTP/1.1" 200 772 "https://www.nationstates.net/template-overall=none/page=blank/reliant=main" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.82 Safari/537.36"
That's normal behavior; the difference is that usually it's R/Ders manually F5ing the normal reports page, whereas here it's Reliant doing it through a keybind and parsing the result for presentation to the user.
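For readers following along, here is a small sketch of how the script/userclick identification embedded in that request path could be pulled out of an access-log line like the one quoted above. This is an illustrative parser, not NS's actual tooling:

```javascript
// Extract the "script" and "userclick" identification parameters that the
// script rules ask tools to embed in the request path. Note NS uses
// /key=value/ path segments rather than ?key=value query strings.
function parseScriptId(logLine) {
  // Grab the request path from a combined-log-format line: "GET <path> HTTP/1.1"
  const m = logLine.match(/"(?:GET|POST) (\S+) HTTP/);
  if (!m) return null;
  const params = {};
  for (const seg of m[1].split("/")) {
    const eq = seg.indexOf("=");
    if (eq > 0) params[seg.slice(0, eq)] = seg.slice(eq + 1);
  }
  return { script: params.script ?? null, userclick: params.userclick ?? null };
}

const line = 'xxx.xxx.xxx.xxx - - [06/Feb/2022:22:25:39 -0800] "GET /template-overall=none/page=reports/script=reliant_1.3/userclick=1644215140121 HTTP/1.1" 200 772';
// parseScriptId(line) → { script: "reliant_1.3", userclick: "1644215140121" }
```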
by United Calanworie » Tue Jun 14, 2022 7:43 pm
[violet] wrote:Roavin wrote:
That's normal behavior; the difference is that usually it's R/Ders manually F5ing the normal reports page, whereas here it's Reliant doing it through a keybind and parsing the result for presentation to the user.
Are you saying it's normal behavior for a scriptless R/Der to F5 the reports page ten times per second? I struggle to see this as it means they have 100ms to send the request to the server, have it processed, get it back, and then, with their human brain, make a decision about it. Where I am in the world, it takes 250ms - 800ms just to get a response from the reports page, even with 'template-overall=none', so if I'm pressing F5 ten times a second, all I'm doing is canceling every previous request before it completes -- I never see any data at all. In an ideal scenario where I'm physically located very close to the server and it's not congested, I might get a response as fast as 50ms, after which my browser can begin to paint, so it's conceivably possible -- but I have to read & process the data I get back in 1/20th of a second.
by Esfalsa » Tue Jun 14, 2022 8:03 pm
by Refuge Isle » Tue Jun 14, 2022 8:15 pm
by [violet] » Tue Jun 14, 2022 11:36 pm
by Roavin » Tue Jun 14, 2022 11:41 pm
[violet] wrote:Are you saying it's normal behavior for a scriptless R/Der to F5 the reports page ten times per second? I struggle to see this as it means they have 100ms to send the request to the server, have it processed, get it back, and then, with their human brain, make a decision about it. Where I am in the world, it takes 250ms - 800ms just to get a response from the reports page, even with 'template-overall=none', so if I'm pressing F5 ten times a second, all I'm doing is canceling every previous request before it completes -- I never see any data at all. In an ideal scenario where I'm physically located very close to the server and it's not congested, I might get a response as fast as 50ms, after which my browser can begin to paint, so it's conceivably possible -- but I have to read & process the data I get back in 1/20th of a second.
[violet] wrote:We actually identified that very early -- at first I deemed the traffic illegal based on the lack of UserAgent, but Elu pointed out that he'd advised people that the URL could be an acceptable substitute under certain conditions. We then talked about updating the Script Rules to reflect this. They still are not updated, though, because there is a lot of discussion about where we're going with script rules in general. I didn't check whether Reliant met Elu's conditions to qualify for not using a UserAgent, but I assumed it did, since there's not much reason to avoid it otherwise.
2. Identify your script via User Agent
You must identify your script with every request it makes to the site. This identification must at the very least include the name of your script, the version of your script, and a means to contact you, the script author. The contact information could be a nation name, an email address, or your website's URL, and allows us to contact you if something goes wrong, and give you a chance to fix it. In nearly all cases, the identification should occur with the User-Agent HTTP header. If that is not possible due to technical limitations, it's acceptable to instead pass the identification via the URL Parameter "script".
If the script is performing an action based on direct user input (see above), it should include a URL Parameter "userclick" with a UTC timestamp in milliseconds containing the time the user clicked the button.
As an example of this rule, in a Greasemonkey/Tampermonkey script responding to user input, one could do the following to any URLs they issue:
- Code:
// append script name, version, and author
url = url + "/script=exampleScript_"+GM_info.script.version + "_by_Testlandia";
// append user click timestamp
url = url + "/userclick=" + Date.now(); // ms since epoch (UTC); Date.UTC() without arguments returns NaN
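The example above covers the URL-parameter fallback; for standalone tools that run outside the browser, the rules prefer the User-Agent header. A minimal sketch, assuming Node 18+ for the global fetch; the script name, version, and contact nation are placeholder values, not a real tool:

```javascript
// Build the identifying User-Agent header required by rule 2: script name,
// version, and a means of contact (here a nation name, all placeholders).
function identHeader(name, version, contact) {
  return { "User-Agent": `${name}/${version} (by ${contact})` };
}

// Example usage against the public API (requires Node 18+ global fetch).
async function fetchNationName(nation) {
  const res = await fetch(
    `https://www.nationstates.net/cgi-bin/api.cgi?nation=${nation}&q=name`,
    { headers: identHeader("ExampleScript", "1.0", "Testlandia") }
  );
  return res.text();
}
```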
Flanderlion wrote:Are the QoL improvements for individuals using their own scripts worth the admin time to police it? Like, sometimes something that seems like a good idea (allowing scripts on the HTML site to improve QoL) turns into more work than it's worth. Would it really be the end of the world to have a blanket ban, and enforce it only when something comes to mod/admin attention?
by Roavin » Tue Jun 14, 2022 11:47 pm
[violet] wrote:Thanks for the above. Somewhere around 99% of players live at least 40ms away from our server, which stacks on top of page generation time, so 10 requests per second still strikes me as very hard to achieve by someone without a script to prevent F5s from becoming simultaneous and stomping on each other. But I understand the point about refreshing quickly and allowing your slow human brain to process data from several refreshes ago.
Just so I understand, with these requests for the reports page, Reliant is simply acting as a keybind? It's not doing anything special with the data returned?
by Trotterdam » Wed Jun 15, 2022 12:26 am
Roavin wrote:Left is Breeze's augmented reports page, right is stock Reports page:
It looks like all of the differences except for the "page load time" addition could be handled with a custom CSS stylesheet, which isn't even a script in any sense the word is normally used?
by Roavin » Wed Jun 15, 2022 12:51 am
by [violet] » Wed Jun 15, 2022 12:55 am
Roavin wrote:I would add two more here, from my perspective at least, that couldn't be easily sorted into category two. The first is accessibility
Roavin wrote:The second is QoL improvements to the existing interface. For example, the old Telescope augmented the site UI to include buttons that would allow endorsing or moving directly from any nation or region link. I know that raiders have scripts that do the same with ejection buttons. I'm less familiar with other areas of the game (e.g. cards), but I'm told that similar tools exist there as well. These aren't really accessibility features, or even necessary to play the game, but they make things much easier and more convenient. I'm not sure how such UI augmentation could be possible through the API.
by Roavin » Wed Jun 15, 2022 1:53 am
[violet] wrote:Thanks, and I understand your explanations, but I'm having trouble reconciling them with the conclusion. It seems to me that due to the reasons we've discussed, there are only limited circumstances in which a script-less user would refresh 10 times per second: in particular, they have to live next door to the origin server, and have excellent cadence and lightning reflexes. Which I don't think is normal. Whereas there are a couple of reasons why that behavior would be normal for Reliant users: no need to worry about cadence, and benefiting from whatever additional processing Reliant provides that you allude to in your second post.
In the server logs, I see Reliant sends plenty of very high-velocity page requests, whereas non-Reliant traffic to the same pages tends to be much slower. Now this might be explained if we accept that people using Reliant are precisely those who would be fast-clicking without it anyway.
[violet] wrote:But that strikes me as a bit of a stretch -- I accept it to a degree, but I don't know if I can at 10reqs/s. I want to make sure I don't misunderstand -- is that actually your assertion, that Reliant has nothing to do with that behavior, as it's normal for R/Ders without scripts too?
[violet] wrote:Roavin wrote:I would add two more here, from my perspective at least, that couldn't be easily sorted into category two. The first is accessibility
I still don't see anything that couldn't be fetched from the API, though. The Breeze pic in particular is so different from the stock NS reports page that I really can't fathom what it gains by hitting the HTML site -- other than avoiding the API Happenings delay, of course. The content is slower to generate, liable to change format without warning, and Breeze throws away the HTML formatting it comes wrapped in... so what is the unavoidable need to fetch it from the HTML site, rather than consume XML or JSON from an API? Or, equally, when it's sending commands, why is it essential to send those commands to the HTML site, not the API?
[violet] wrote:If the answer is just "the API Happenings delay," then that's what I was referring to earlier, where it's a script that hits the HTML because it wants to do something the API doesn't currently permit. And, to be clear, I'm not saying I want to force all HTML scripts to eat that API delay. I'm saying we would have a lot of questions about what kind of delay there should be, if any, for this and other new API endpoints.
[violet] wrote:So these range from simple themes and stylesheets (no concern to admin at all), to tools that add or move around buttons (generally ok), to tools that handle executing requests and processing the returned data (our area of concern). None of them bar the last are currently hitting the HTML site, right? So they're not the subject of discussion. And the last kind, which is much closer to a bot than a UI augmentation, is just as capable of querying the API as it is of querying the HTML site, as far as I can see.
by SherpDaWerp » Wed Jun 15, 2022 4:20 am
[violet] wrote:So these range from simple themes and stylesheets (no concern to admin at all), to tools that add or move around buttons (generally ok), to tools that handle executing requests and processing the returned data (our area of concern). None of them bar the last are currently hitting the HTML site, right? So they're not the subject of discussion. And the last kind, which is much closer to a bot than a UI augmentation, is just as capable of querying the API as it is of querying the HTML site, as far as I can see.
by [violet] » Wed Jun 15, 2022 4:21 pm
Roavin wrote:Absolutely, plus there could be other ideas to efficiently transmit changes through the API (for example, a Websocket endpoint for activity log without requiring costly poll requests). This would make tooling better for sure, plus be better for servers, so I'm with you on this, that this could certainly be a desirable future for everybody involved.
by [violet] » Wed Jun 15, 2022 4:28 pm
SherpDaWerp wrote:If there was a bit more specific wording employed (i.e. less "ban html scripts" and more "ban html request-making scripts") or some Official Wording about what's actually being considered, that would quell a lot of fears in the Cards community. It's a bit like putting the cart before the horse, I know, drafting a wording for a rule that you're not even convinced you need yet, but if there was a specific "admin do not intend to ban your QoL tool" post to point at, there'd be a few less Cards voices against this change.
by [violet] » Wed Jun 15, 2022 4:39 pm
Roavin wrote:How would you categorize Telescope's Endorse button, which is injected for every nlink and calls endorse.cgi when clicked by the user?
by [violet] » Wed Jun 15, 2022 6:52 pm
Ever-Wandering Souls wrote:It's unfortunately almost impossible to get even objectively simple API calls added via the requests thread, much less complex new ones that replace things currently done via HTML scripting, /much less/ things like an account/puppet manager/login switcher that admin has repeatedly said is infeasible. =/
by Sandaoguo » Wed Jun 15, 2022 7:13 pm
[violet] wrote:So yes, the API thread is full of people asking for things and not getting them from admin, and it's reasonable to worry that this will be the case for supposed new endpoints. But I think people should realize that the API is very mature in its existing offerings, and a lot of new requests are either fairly niche or else major features that are only worth coding if we can be confident that script authors will actually use them.
by [violet] » Wed Jun 15, 2022 9:45 pm
Sandaoguo wrote:Things that are simply not feasible to do through the API alone with the low rate limits should be high priority, for example. An endpoint that returns all endorsable nations in a region for X input would replace the endoswapping tools that rely on a complex combo of daily dumps, rate-limited API calls, or page scraping.* In the past, we've basically been told that status quo is fine, but it's that kind of necessary added complexity that turns people off from using the API.
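The endoswap logic Sandaoguo describes is, at its core, simple set arithmetic once the data is assembled; the expensive part today is gathering that data from dumps and rate-limited calls. A hypothetical single-request endpoint would only need to return something like the output of this illustrative function (not any existing tool's code):

```javascript
// Sketch of the set logic endoswap tools currently reassemble from daily
// dumps and rate-limited API calls. `waMembers` = WA nations in the region,
// `myEndorsements` = nations you have already endorsed, `self` = your nation.
function endorsable(waMembers, myEndorsements, self) {
  const done = new Set(myEndorsements);
  // Endorsable = every WA member in the region, minus yourself,
  // minus those you've already endorsed.
  return waMembers.filter((n) => n !== self && !done.has(n));
}

// endorsable(["a", "b", "c", "me"], ["b"], "me") → ["a", "c"]
```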
by Roavin » Thu Jun 16, 2022 3:27 am
[violet] wrote:Roavin wrote:Absolutely, plus there could be other ideas to efficiently transmit changes through the API (for example, a Websocket endpoint for activity log without requiring costly poll requests). This would make tooling better for sure, plus be better for servers, so I'm with you on this, that this could certainly be a desirable future for everybody involved.
I have already built a basic but functional event-based API for Happenings, which runs on Node and notifies listening clients via SSE. It would be approximately a million times better than the current situation of scripts polling for HTML multiple times per second.
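For context on what an SSE-based Happenings feed involves on the wire: in a browser, the built-in EventSource handles the text/event-stream format, but a minimal parser is easy to sketch. The event name and payload shape below are assumptions for illustration, not the actual prototype's output:

```javascript
// Minimal parser for the Server-Sent Events wire format such an endpoint
// would emit. In a browser you would instead just do:
//   new EventSource("https://example.org/happenings").onmessage = ...;
// (endpoint URL hypothetical)
function parseSSE(chunk) {
  const events = [];
  // Events are separated by a blank line; each has optional "event:" and
  // one or more "data:" fields.
  for (const block of chunk.split("\n\n")) {
    const ev = { event: "message", data: [] };
    for (const line of block.split("\n")) {
      if (line.startsWith("event:")) ev.event = line.slice(6).trim();
      else if (line.startsWith("data:")) ev.data.push(line.slice(5).trim());
    }
    if (ev.data.length) events.push({ event: ev.event, data: ev.data.join("\n") });
  }
  return events;
}

const stream = 'event: happening\ndata: {"nation":"testlandia"}\n\n';
// parseSSE(stream) → [{ event: "happening", data: '{"nation":"testlandia"}' }]
```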
[violet] wrote:Roavin wrote:How would you categorize Telescope's Endorse button, which is injected for every nlink and calls endorse.cgi when clicked by the user?
I'm not familiar with that tool. If it's only adding dumb buttons that are found elsewhere on the page or site, it's maybe just restyling. If it's dynamically crafting URLs, or managing the sending and receiving of data, it's a script that could use the API.
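To make that distinction concrete, a button of the kind described would look roughly like this as a userscript. The endorse.cgi parameter format, the a.nlink selector, and the identification string are illustrative assumptions, not Telescope's actual code:

```javascript
// Build a dynamically crafted endorse URL, identifying the script via the
// URL-parameter fallback and including the user-click timestamp, since
// this fires on direct user input. All names here are placeholders.
function endorseUrl(nation) {
  return "/cgi-bin/endorse.cgi" +
    "/nation=" + encodeURIComponent(nation) +
    "/script=exampleScript_1.0_by_Testlandia" +
    "/userclick=" + Date.now();
}

// Browser-only portion, guarded so the URL logic above stays testable
// outside a browser: inject an "Endorse" button after every nation link.
if (typeof document !== "undefined") {
  for (const link of document.querySelectorAll("a.nlink")) {
    const btn = document.createElement("button");
    btn.textContent = "Endorse";
    btn.onclick = () => fetch(endorseUrl(link.getAttribute("href").split("=").pop()));
    link.after(btn);
  }
}
```

Under [violet]'s taxonomy this would fall into the last category: it crafts URLs and sends requests, so it is the kind of tool that could equally target an API endpoint.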
by Sandaoguo » Thu Jun 16, 2022 8:01 am
[violet] wrote:Sandaoguo wrote:Things that are simply not feasible to do through the API alone with the low rate limits should be high priority, for example. An endpoint that returns all endorsable nations in a region for X input would replace the endoswapping tools that rely on a complex combo of daily dumps, rate-limited API calls, or page scraping.* In the past, we've basically been told that status quo is fine, but it's that kind of necessary added complexity that turns people off from using the API.
Sorry to zero in on this part and ignore your questions, but I'm not proposing to add endpoints that offer functionality that isn't currently available on the HTML site either. That kind of thing might be looked at, especially where it lets everyone cut down the number of requests needed to do something, but it wouldn't be a high priority. The priority would be adding API support for things you currently do on the HTML site.
by [violet] » Thu Jun 16, 2022 6:42 pm
Sandaoguo wrote:If you're going to ban HTML scripts and tell devs to use the API, then you need to be willing to add in API endpoints for common use cases.
by [violet] » Thu Jun 16, 2022 6:50 pm
by Refuge Isle » Thu Jun 16, 2022 8:06 pm
[violet] wrote:So fyi here are what I think are some reasonable concerns people might have about being shifted to the API:
- Admin might not actually get around to adding the API endpoints needed
- The API Happenings delay might not be removed
- Once admin can see what everyone's scripts are doing, new restrictive rules might be added
- People might be outcompeted by fully-automated API bots
I suspect some of the above might be behind arguments that scripts are somehow technically unable to interact with api.nationstates.net rather than www. I want it known that it's fine to hold the above concerns, and we can discuss them.