SyntaxError #37
Comments
Thanks. I'm realizing some of my fixes are not online. I was doing a redesign to limit the memory use, which I hope to finish soon; that will replace the current version.
Great to learn that you are using it!
Best,
Marc
…On Wed, Oct 16, 2024, 18:56 Jess Byun ***@***.***> wrote:
Dear Marc Jan Bonder,
I am sharing what I have noticed in case other people have had the same error.
I noticed syntax errors in both run_QTL_analysis.py and run_interaction_QTL_analysis.py.
The error in my log file was:
File "Limix_QTL/run_QTL_analysis.py", line 161
print("Feature: "+feature_id+" not tested not enough samples do QTL test (n="+str(sum(~np.isnan(phenotype_df.loc[feature_id,:])))+").")")
SyntaxError: EOL while scanning string literal
I modified that line to:
print("Feature: "+feature_id+" not tested not enough samples do QTL test (n="+str(sum(~np.isnan(phenotype_df.loc[feature_id,:])))+")")
Another error I noticed while running run_interaction_QTL_analysis.py was:
UnboundLocalError: local variable 'u_snp_matrix' referenced before assignment
I added the following before line 618 of qtl_utilities.py:
if not isinstance(snp_matrix_DF, pd.DataFrame):
    snp_matrix_DF = pd.DataFrame(snp_matrix_DF)
It seems to be running and providing results :) Let me know if this works too 👍
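(For anyone else hitting that traceback: the error class itself just means a name was only assigned on some code path. A minimal illustration, not the actual logic around line 618 of qtl_utilities.py:

# A name assigned only inside one branch is unbound when that branch is
# skipped, and referencing it then raises UnboundLocalError.
def demo(have_genotypes):
    if have_genotypes:
        u_snp_matrix = "filled in"
    return u_snp_matrix  # fine when have_genotypes is True, error otherwise

print(demo(True))
# demo(False) would raise:
# UnboundLocalError: local variable 'u_snp_matrix' referenced before assignment
)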
Please see the section on running via Snakemake in the wiki; that is meant for this option. Alternatively, you can use the -gr option directly. The block size indicates how many variants are loaded from disk at a time; it can limit memory use but requires more time.
I would recommend keeping that fixed.
Best,
Marc
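(To illustrate the trade-off described above with a generic sketch of the block idea, not Limix_QTL's internals: processing variants in fixed-size blocks keeps only one block in memory at a time, at the cost of more passes.

def iter_variant_blocks(variant_ids, block_size):
    # Yield consecutive slices so only one block needs to be held in memory.
    for start in range(0, len(variant_ids), block_size):
        yield variant_ids[start:start + block_size]

for block in iter_variant_blocks([f"rs{i}" for i in range(25)], block_size=10):
    print(len(block))  # 10, 10, 5 -- smaller blocks mean less memory, more iterations
)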
…On Wed, Oct 16, 2024, 19:06 Jess Byun ***@***.***> wrote:
Thank you very much for telling me about memory use. I noticed that 1000 permutations took a long time. I still couldn't figure out the option for the block size (-blocksize), so I broke the analysis down by making my own -feature_filename inputs and running my own blocks (e.g., ten genes at a time) simultaneously. It would be great to get updates on memory usage, etc. :)
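(A rough sketch of that chunking workaround, with hypothetical file names and a placeholder gene list: split the feature IDs into files of ten, then pass each file via -feature_filename to a separate job.

def write_feature_chunks(feature_ids, chunk_size=10, prefix="features_chunk"):
    # Write one plain-text file of feature IDs per chunk and return the paths.
    paths = []
    for i in range(0, len(feature_ids), chunk_size):
        path = f"{prefix}_{i // chunk_size}.txt"
        with open(path, "w") as fh:
            fh.write("\n".join(feature_ids[i:i + chunk_size]) + "\n")
        paths.append(path)
    return paths

# Example: paths = write_feature_chunks(my_gene_ids), using your own list of gene IDs.
)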