[php server] Using local HTTP server causes error "B HTTP read timeout" #3
Looks like anything less than 64K (!!) triggers the issue... which makes me think it's a problem at my end. I'll look into it some more...
A few questions that can help me debug:
I'll do some local tests too to see if I can replicate.
I've 'fixed' the issue locally - I just send extra data from my local server after the file itself, then make sure to flush everything...

```php
<?php
// Save as router.php and run with:
// php -S 0.0.0.0:8000 router.php
@ini_set('zlib.output_compression', 0);
@ini_set('implicit_flush', 1);
$filename = $_SERVER["SCRIPT_FILENAME"];
header("Content-Type: application/octet-stream");
header("Content-Length: " . filesize($filename));
$fh = fopen($filename, 'rb');
fpassthru($fh);
fclose($fh);
// Feed the client 64K of garbage to flush its buffers (maybe???)
for ($i = 0; $i < 65536; $i++) {
    echo " ";
}
for ($i = 0; $i < ob_get_level(); $i++) { ob_end_flush(); }
ob_implicit_flush(1);
flush();
return true;
```

This isn't a 'proper' fix, as I'm sending more data than I should need to. I need to test a bit more with remote files as well to see how that fares. I'll debug the issue a bit more over the next few days, and will send over the info you've requested. Thanks again.
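Since a Python local server reportedly showed the same behaviour, here is what the same padding hack might look like in Python - a rough illustrative sketch, not part of the project; the handler name and port are made up:

```python
# Hypothetical Python equivalent of the router.php padding hack:
# serve the file with a correct Content-Length, then append 64K of
# padding so the client's buffers (maybe) get flushed.
import http.server
import os

class PaddedHandler(http.server.SimpleHTTPRequestHandler):
    def do_GET(self):
        path = self.translate_path(self.path)
        if not os.path.isfile(path):
            self.send_error(404)
            return
        size = os.path.getsize(path)
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.send_header("Content-Length", str(size))
        self.end_headers()
        with open(path, "rb") as fh:
            self.wfile.write(fh.read())
        # Feed the client 64K of padding to flush its buffers (maybe???)
        self.wfile.write(b" " * 65536)

# To run:
# http.server.HTTPServer(("0.0.0.0", 8000), PaddedHandler).serve_forever()
```

A well-behaved client stops reading the body after Content-Length bytes, so the padding should be discarded rather than corrupting the file.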
P.S. The file arrives on the Next with correct size, as it correctly used the Content-Length header - the extra bytes are just discarded. This isn't ideal, but I want to send stuff locally directly to the Next, so this'll work for now. I will get more info across to you though... |
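To illustrate why the file still arrives intact: an HTTP client that honours Content-Length reads exactly that many body bytes and ignores anything trailing on the connection. A minimal sketch (the function name is made up for illustration):

```python
def read_body(raw_response: bytes) -> bytes:
    """Return exactly Content-Length bytes of body from a raw HTTP
    response, discarding any trailing padding (illustrative sketch)."""
    head, _, rest = raw_response.partition(b"\r\n\r\n")
    for line in head.split(b"\r\n"):
        if line.lower().startswith(b"content-length:"):
            n = int(line.split(b":", 1)[1])
            return rest[:n]
    return rest  # no Content-Length: read to end of stream
```

With the padding hack above, the 64K of spaces falls after the declared length, so a client like this never sees it as part of the file.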
If I have time I'll fire up a local PHP server and run some tests too. Still, glad you've got a workaround 👍
Yeah, I'm not happy with my 'solution' (read: hack). The 64K size thing bugs me as well. Also, I tested with a Python local server - just so as not to be biased against PHP. The issue was the same - any file below 64KB causes the issue on the Next - feels like a local networking issue to me - probably nothing to do with you, in all honesty! Anyway, here's the info you requested:
The curl and versions look fine - older firmware, somehow, actually behaved better (we found that 1.6.0.0 introduced a pretty serious bug which I had to code around in the end!)
Thanks Remy. I'll get back on this in the next few days and provide you with the info you've requested. In this case, the file is a .scr file. I'll email over the original + received file at some point (and will do a diff myself as well).
Please find attached a zip containing the .scr from the server, the received version on the Next, and 4k-esp-bank.bin.
Ah, it looks like I broke my own debugging tool! The bank dump was empty - it should have had useful info in it 🤦. I'll get my debugging tool fixed (oh, you didn't modify capture-esp.bas did you? it uses a debug build). Can I assume you're running the latest build?
I grabbed the latest code from here before running anything. I've just tried with a different ESP module, and the result is the same (which also has the same firmware version it seems). However, when using a hostname + port for the same file on a remote server - the file downloads perfectly (as you'd expect!) Have you been able to reproduce the issue on your local network? I'm still wondering if it's an issue with http itself or something to do with my network setup (networking - always such fun!) I'm happy to carry on looking in to this. I'd like to get it working locally as it would be an ideal way of pushing code to my Next - just drop the files on a local web server and grab it at the other end (without fiddling with SD cards etc). Not so ideal if I need to upload the files to the web first to download them! Anyway, thanks for taking the time to look in to this. It's much appreciated. Let me know if there's anything else I can do to help. |
What you're doing is all correct - looks like I broke my own test build! I'll fix that up and send it back to you for insights. What I believe is happening under the hood is that the ESP is flushing messages too quickly for the machine to keep up, and so it overshoots the packet marker (the +IPD marker). What I don't understand (yet) is why having a larger buffer helps (though, thinking about it now, it might allow the code to get enough data, but with packets still missed, leaving you with a corrupted file).
I've just double checked, and yes, larger files (>= 64KB) do contain the IPD packet markers, so nothing is transferring reliably locally for me - larger files appear to work, but always have the IPD markers left in the data.
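For anyone following along: the ESP AT firmware frames incoming TCP data with `+IPD,<len>:` markers in the serial stream, so the receiver has to strip those markers out to recover the payload; overshooting one mixes framing bytes into the file. A rough sketch of that parsing (simplified - it assumes a single connection, whereas real AT streams can also carry a connection id, as in `+IPD,<id>,<len>:`):

```python
def extract_ipd_payloads(stream: bytes) -> bytes:
    """Pull the payload bytes out of an ESP AT-style serial stream
    framed with +IPD,<len>: markers (simplified illustrative sketch)."""
    out = bytearray()
    i = 0
    while True:
        marker = stream.find(b"+IPD,", i)
        if marker < 0:
            break
        colon = stream.find(b":", marker)
        if colon < 0:
            break  # truncated marker at end of stream
        length = int(stream[marker + 5:colon])
        out += stream[colon + 1:colon + 1 + length]
        i = colon + 1 + length
    return bytes(out)
```

A file that still contains literal `+IPD` sequences after transfer is a sign this de-framing step was skipped or overshot.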
Just to let you know I'm still on this - I need to fix up my debug build so it can get us the right information.
No worries - I appreciate it.
Firstly, thanks for your work on this. It's very useful!
I think I've found a bug which seems to occur only with very small downloads. I was trying to download the following file, which was on my local web server:
https://github.com/em00k/next-zxdb-downloader/blob/main/zxdb-dl/zxdb-loader.bas?raw=true
I have a PHP server running locally, with the files from here in the root directory:
https://github.com/em00k/next-zxdb-downloader/tree/main/zxdb-dl
Downloading zxdb-dl.bin to the Next worked fine (81KB), but downloading zxdb-loader.bas (188 bytes) failed.
The command I used to serve the files was:
The command I used to get the file on the Next was:
I see the server return a 200 for the request on my laptop, but http on the Next then fails with "B HTTP read timeout".
I've put the code in a BASIC program, and retried the download using ON ERROR, but the download never succeeds. The Next is running at 28MHz, by the way.
I've not experimented to find out what the minimum file size is that will work - that's next on my list.
Let me know if there's anything I can do to help, or if you need any more information.
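For probing the minimum working size, one option is generating test files either side of the suspected 64K boundary and trying each in turn - a small sketch (file names and sizes are arbitrary):

```python
# Generate zero-filled test files around the suspected 64K threshold,
# ready to drop into the local web server's root directory.
for kb in (1, 32, 63, 64, 65, 128):
    with open(f"test-{kb}k.bin", "wb") as f:
        f.write(b"\x00" * (kb * 1024))
```

Fetching each of these from the Next should narrow down exactly where the timeout starts.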