
[php server] Using local HTTP server causes error "B HTTP read timeout" #3

Open
gazsp opened this issue May 19, 2021 · 15 comments


gazsp commented May 19, 2021

Firstly, thanks for your work on this. It's very useful!

I think I've found a bug which seems to occur only with very small downloads. I was trying to download the following file, which was on my local web server:

https://github.com/em00k/next-zxdb-downloader/blob/main/zxdb-dl/zxdb-loader.bas?raw=true

I have a PHP server running locally, with the files from here in the root directory:

https://github.com/em00k/next-zxdb-downloader/tree/main/zxdb-dl

Downloading zxdb-dl.bin to the Next worked fine (81 KB), but downloading zxdb-loader.bas (188 bytes) failed.

The command I used to serve the files was:

php -S 0.0.0.0:8000 -t ./

The command I used to get the file on the Next was:

.http -h 192.168.0.6 -p 8000 -u /zxdb-loader.bas -f zxdb-loader.bas

I see the server return a 200 for the request on my laptop, but .http on the Next fails with "B HTTP read timeout".

I've put the code in a BASIC program and retried the download using ON ERROR, but the download never succeeds. The Next is running at 28MHz, by the way.

I've not experimented to find out what the minimum file size is that will work - that's next on my list.

Let me know if there's anything I can do to help, or if you need any more information.


gazsp commented May 19, 2021

Looks like anything less than 64K (!!) throws up the issue... which makes me think it's a problem at my end.

I'll look into it some more...


remy commented May 19, 2021

A few questions that can help me debug:

  1. Are you running on hardware or an emulator (if so, which)?
  2. Do you know your ESP firmware versions, and can you paste them here? (To get them, on the Spectrum run: .uart then AT+GMR.)
  3. Can you run the following and paste the result you see (from your computer rather than the Speccy)?
curl -i -X GET 192.168.0.6:8000/zxdb-loader.bas

I'll do some local tests too to see if I can replicate.


gazsp commented May 19, 2021

I've 'fixed' the issue locally - I just send extra data from my local server after the file itself, then make sure to flush everything...

<?php

// Save as router.php and run with:
// php -S 0.0.0.0:8000 router.php

@ini_set('zlib.output_compression', 0);
@ini_set('implicit_flush', 1);

$filename = $_SERVER["SCRIPT_FILENAME"];

header("Content-Type: application/octet-stream");
header("Content-Length: " . filesize($filename));

// Open in binary mode so the payload isn't mangled on any platform
$fh = fopen($filename, 'rb');
fpassthru($fh);
fclose($fh);

// Feed the client 64K of garbage to flush its buffers (maybe???)
for ($i = 0; $i < 65536; $i++) {
	echo " ";
}

// Flush every output buffering level, then the SAPI buffer itself
for ($i = 0; $i < ob_get_level(); $i++) { ob_end_flush(); }
ob_implicit_flush(1);
flush();

return true;

This isn't a 'proper' fix, as I'm sending more data than I should need to. I need to test a bit more with remote files as well to see how that fares.

I'll debug the issue a bit more over the next few days, and will send over the info you've requested.

Thanks again.


gazsp commented May 19, 2021

P.S. The file arrives on the Next with correct size, as it correctly used the Content-Length header - the extra bytes are just discarded. This isn't ideal, but I want to send stuff locally directly to the Next, so this'll work for now. I will get more info across to you though...


remy commented May 19, 2021

If I have time I'll fire up a local php server and run some tests too. The way .http works is that it closes the connection once it has all the bytes it expected, so I can imagine you might have to flush, but I wouldn't expect it to need more bytes to sort of… push the data over the wire 🤔

Still, glad you've got a workaround 👍
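To illustrate the behaviour described above, here's a rough Python sketch (not the actual .http implementation) of a client that reads exactly Content-Length bytes and then stops; trailing padding such as the 64K of spaces in the router.php hack would simply be discarded, while a short body would stall the read:

```python
import io

def read_body(stream, content_length):
    """Read exactly content_length bytes, then stop, mirroring a client
    that closes the connection once it has all the bytes it expected."""
    remaining = content_length
    chunks = []
    while remaining > 0:
        # 1460 is a typical TCP segment payload size (hence +IPD,1460 later)
        chunk = stream.read(min(1460, remaining))
        if not chunk:
            # Server delivered fewer bytes than Content-Length promised
            raise TimeoutError("B HTTP read timeout")
        chunks.append(chunk)
        remaining -= len(chunk)
    return b"".join(chunks)

# A 15-byte body followed by 64 bytes of trailing padding:
# only Content-Length bytes are kept, the padding is never read.
stream = io.BytesIO(b'10 PRINT "hi"\r\n' + b" " * 64)
body = read_body(stream, 15)
```

This is only a model of the contract, but it matches the symptom: if bytes go missing mid-transfer, the loop keeps waiting for data that never arrives and times out.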

@remy remy changed the title Small downloads cause "B HTTP read timeout" [php server] Small downloads cause "B HTTP read timeout" May 19, 2021

gazsp commented May 19, 2021

Yeah, I'm not happy with my 'solution' (read: hack). The 64K size thing bugs me as well. I also tested with a local Python server, just to rule out PHP, and the issue was the same: any file below 64 KB causes the issue on the Next. It feels like a local networking issue to me - probably nothing to do with you, in all honesty!

Anyway, here's the info you requested:

  1. Running on real hardware - this: https://ultimatemister.com/product/zx-next-official-clone-board/
  2. AT v1.1.0.0 (May 11 2016), SDK v1.5.4 (Compile time May 20 2016)
  3. See below:
❯ curl -i -X GET 192.168.0.6:8000/zxdb-loader.bas
HTTP/1.1 200 OK
Host: 192.168.0.6:8000
Date: Wed, 19 May 2021 20:14:34 +0000
Connection: close
X-Powered-By: PHP/7.1.33
Content-Type: application/octet-stream
Content-Length: 188

Old firmware, maybe? Scratch that - the ZXDB downloader software works fine. Pretty sure it's something to do with my home setup. Going to try .http on some remote URLs with small file sizes in a bit.


remy commented May 19, 2021

The curl and versions look fine - older firmware, somehow, actually behaved better (we found that 1.6.0.0 introduced a pretty serious bug which I had to code around in the end!)

There's a capture-esp.bas file in this repo that requests 4K from one of my servers and then captures a bunch of debug information at the same time. This might be useful as it gives me eyes into the ESP exchange. If you want to give that a go and send me the resulting 4k-esp-bank.bin (or email it to me at remy@remysharp.com) then I can take a look to see the detail.


gazsp commented May 19, 2021

Thanks Remy. I'll get back on this in the next few days and provide you with the info you've requested.

What's weirder - I've done a copy $filename to #2 to dump the downloaded file to the screen, and part of the UART data has been saved to the start of the file. The corruption is always the same, +IPD,1460: prepended to the start of the file (I've checked a couple of files now - they might even be corrupted all the way through - I need to check).

[Screenshot IMG_0335: the downloaded file dumped to the screen, showing the +IPD,1460: corruption at the start]

In this case, the file is a .scr file. I'll email over the original + received file at some point (will do a diff myself as well).


gazsp commented May 19, 2021

Please find attached a zip containing the .scr from the server, the received version on the Next, and 4k-esp-bank.bin.

Archive.zip


remy commented May 19, 2021

Ah, it looks like I broke my own debugging tool! The bank dump was empty - it should have had useful info in it 🤦.

I'll get my debugging tool fixed (oh, you didn't modify the capture-esp.bas did you? it uses a debug build called http-debug.dot in the same directory - I forgot to mention that).

If the +IPD is in your file, that tells me it's dropping data. The debug bank will help me confirm, but what's very strange is that it's not dropping for larger files…

Can I assume you're running the latest build of .http?


gazsp commented May 20, 2021

I grabbed the latest code from here before running anything. capture-esp.bas failed for me at first, as I didn't have the debug dot command in place; I made sure it was there before running it a second time. I didn't modify the .bas file in any way (I had a look at the code to see which line failed, then realised it used the debug dot file, which I dropped in place).

I've just tried with a different ESP module, and the result is the same (which also has the same firmware version it seems).

However, when using a hostname + port for the same file on a remote server - the file downloads perfectly (as you'd expect!)

Have you been able to reproduce the issue on your local network? I'm still wondering if it's an issue with .http itself or something to do with my network setup (networking - always such fun!)

I'm happy to carry on looking into this. I'd like to get it working locally, as it would be an ideal way of pushing code to my Next - just drop the files on a local web server and grab them at the other end (without fiddling with SD cards etc). Not so ideal if I need to upload the files to the web first just to download them!

Anyway, thanks for taking the time to look in to this. It's much appreciated. Let me know if there's anything else I can do to help.


remy commented May 20, 2021

What you're doing is all correct - looks like I broke my own test build! I'll fix that up and send it back to you for insights.

What I believe is happening under the hood is that the ESP is flushing messages too quickly for the machine to keep up, so it overshoots the packet marker (the +IPD thing). That means it slurps the packet marker into your file, and then when it tries to keep reading, the socket has closed - which leads to the read timeout.

What I don't understand (yet) is why a larger buffer helps (though thinking about it now, it might be that it lets the code get enough data while still missing packets, leaving you with a corrupted file).
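For context, the ESP8266 AT firmware frames incoming TCP data as +IPD,<length>:<payload>. A simplified Python sketch of the de-framing (illustrative only, not the actual .http code) shows why missing a marker is so destructive: a reader that copies bytes verbatim instead of consuming the marker ends up with the literal text +IPD,1460: in the saved file, exactly as seen on screen.

```python
def deframe(stream: bytes) -> bytes:
    """Strip +IPD,<len>: markers from an ESP8266 AT-style byte stream,
    returning only the payload bytes."""
    out = bytearray()
    i = 0
    while i < len(stream):
        if stream.startswith(b"+IPD,", i):
            colon = stream.index(b":", i)
            length = int(stream[i + 5:colon])       # byte count announced by this frame
            out += stream[colon + 1:colon + 1 + length]
            i = colon + 1 + length                  # resume after the payload
        else:
            i += 1                                  # skip noise between frames (CR/LF etc.)
    return bytes(out)

# A 6-byte file delivered as two frames (4 bytes + 2 bytes):
raw = b"+IPD,4:ABCD\r\n+IPD,2:EF"
payload = deframe(raw)

# If the reader overshoots and never matches a marker, the marker bytes
# (e.g. b"+IPD,1460:") are copied into the output file instead.
```

The sketch assumes well-formed markers; the failure mode in this issue is precisely the case where the reader falls behind and the marker is treated as payload.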


gazsp commented May 20, 2021

I've just double checked, and yes, larger files (>= 64 KB) do contain the +IPD packet markers, so nothing is transferring reliably locally for me - larger files appear to work, but always have +IPD,1460: prepended to them.


remy commented May 25, 2021

Just to let you know I'm still on this - I need to fix up my debug build so it can get us the right information.


gazsp commented May 25, 2021

No worries - I appreciate it.

@gazsp gazsp changed the title [php server] Small downloads cause "B HTTP read timeout" [php server] Using local HTTP server causes error "B HTTP read timeout" Aug 3, 2021