
Not grading in random cases with Open edX #3

Open
angonz opened this issue Nov 10, 2022 · 7 comments
Labels
bug Something isn't working

Comments

angonz commented Nov 10, 2022

Hi,
We have LimeSurvey 5.3.29+220801 with the latest LTI Plugin, connecting to an Open edX Ironwood instance. We use the standard LTI consumer, graded component.
In many random cases, students complete the form but the component fails to grade. In other cases it works well.

Is there any configuration tip? Are other people reporting the same issue?

@adamzammit (Owner)

Hi @angonz ,

I haven't seen this issue before. Can you please confirm whether the grade is stored in a LimeSurvey variable/hidden question as well as being sent to Open edX? If it isn't, can you please configure one to store it as well, and use that variable/hidden question as the value that gets sent? Then we can confirm whether the issue is with LimeSurvey or with the plugin.

Adam


angonz commented Nov 24, 2022

Hi Adam! Thanks for your response. Actually, we only check that the form has been submitted in order to consider it approved. The response is stored on the LimeSurvey server, but the LTI component in Open edX doesn't even record the attempt.

@adamzammit (Owner)

Thanks again for this. The plugin currently attempts the connection to the LMS only once and doesn't record the outcome of that attempt. I'll add something that records this, so we can at least follow up on responses that haven't come through to the LMS.
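To make the idea concrete, here is a minimal sketch in Python (purely illustrative — this is not the plugin's actual code, and both callable names are hypothetical stand-ins for "send the grade via LTI" and "store a value on the response"): the boolean returned by the single passback attempt is captured and persisted, so failed sends can be identified later.

```python
# Hypothetical illustration of recording the outcome of a grade passback.
# report_and_record, send_grade and store_outcome are made-up names; they
# do not appear in the plugin itself.

def report_and_record(send_grade, store_outcome):
    """Attempt the grade passback once and record whether it succeeded.

    send_grade:    callable returning True if the LMS accepted the result.
    store_outcome: callable that persists the outcome string ("1" or "0").
    """
    try:
        ok = bool(send_grade())
    except Exception:
        ok = False  # treat a network/transport error as a failed attempt
    store_outcome("1" if ok else "0")
    return ok


# Stubbed usage: a successful send records "1".
recorded = {}
report_and_record(lambda: True, lambda v: recorded.update(ok=v))
print(recorded["ok"])  # "1"
```

Storing "1"/"0" rather than just the boolean mirrors how a survey attribute would hold the value as text, which is why a "1" in the stored attribute would indicate a send that the code believed succeeded.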

@adamzammit adamzammit self-assigned this Dec 1, 2022
@adamzammit adamzammit added the bug Something isn't working label Dec 1, 2022
adamzammit added a commit that referenced this issue Dec 14, 2022
@adamzammit (Owner)

Hi @angonz
Please try upgrading to the latest 1.1.0 and adding a 7th attribute, which will store the outcome. Hopefully this will help discover what is causing some responses to fail to pass a result.
Adam


angonz commented Jan 2, 2023

Hi @adamzammit, wish you a happy new year!
Thanks for your response and action. I upgraded the plugin today and added the 7th attribute. Let's wait and see the results.


angonz commented Apr 13, 2023

Hi @adamzammit!
After testing for some time, we've found that some records have a "1" in attribute 7. Do you know what this can mean?

@adamzammit (Owner)

A "1" is most likely the result of the function call returning true, which means the grade passback should have completed successfully.
