
Use Doctests #557

Merged: 85 commits from doctests into master, Nov 27, 2019

Conversation

@lrennels (Collaborator) commented Sep 30, 2019

This PR addresses issue #548.

This PR adds Documenter doctests to the Mimi documentation so that the code blocks are run during testing. Doctests are included wherever possible, including in the tutorials, but are not yet used comprehensively in files such as userguide.md, since that would require substantial reconfiguration: most of its blocks are syntax illustrations without enough setup code to run on their own. (A minimal example of the doctest format is sketched after the checklist below.)

  • primary files
  • tutorials
    • tutorial 1 - nsteps error fixed in new FUND version
    • tutorial 2
    • tutorial 3
    • tutorial 4
  • internals - no inclusion of doctests in these files for now
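
For reference, a minimal sketch of the doctest format Documenter looks for (the arithmetic here is a stand-in; the real blocks use Mimi calls). A fenced block labeled jldoctest is executed during doctesting, and the printed output is compared against the text in the block:

    ```jldoctest
    julia> using Mimi

    julia> 1 + 1
    2
    ```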

@lrennels lrennels changed the title Start including doctests Use Doctests Sep 30, 2019
@codecov-io commented Oct 1, 2019

Codecov Report

Merging #557 into master will increase coverage by 0.99%.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master     #557      +/-   ##
==========================================
+ Coverage   79.12%   80.12%   +0.99%     
==========================================
  Files          39       39              
  Lines        2721     2732      +11     
==========================================
+ Hits         2153     2189      +36     
+ Misses        568      543      -25
Impacted Files Coverage Δ
src/core/types/time.jl 100% <ø> (ø) ⬆️
src/core/defs.jl 80.06% <ø> (ø) ⬆️
src/core/defcomp.jl 90.38% <0%> (+0.72%) ⬆️
src/core/connections.jl 89.9% <0%> (+0.91%) ⬆️
src/mcs/defmcs.jl 66.33% <0%> (+0.99%) ⬆️
src/mcs/montecarlo.jl 86.63% <0%> (+4.85%) ⬆️
src/core/show.jl 9.52% <0%> (+7.93%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 1018bd6...f8010bc.

@lrennels (Collaborator, Author) commented Oct 2, 2019

@davidanthoff for the tutorials, using doctests would require pulling down models from our registry. Is there a particular way you would suggest doing that? I don't think I can do it in the Project.toml file, but maybe I could do something similar to what you do for the testing dependencies?

@davidanthoff (Collaborator) commented:
Hm, good question... I guess we could add the registry here, and then the models can just be listed in the Project.toml in the docs folder?

@lrennels (Collaborator, Author) commented Oct 6, 2019

Ah @davidanthoff, that's a good idea. Do you know the command for when you aren't in the Pkg REPL? We'd do using Pkg and then Pkg.add( ? )... is there a registry keyword or something?

UPDATE: I'll use the Registry API: https://julialang.github.io/Pkg.jl/v1.1/api/#Registry-API-Reference-1 (a sketch of the scripted form follows).
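
A minimal sketch of that scripted form, assuming the registry URL shown here (swap in the real registry URL and model names):

    using Pkg

    # Add a custom registry by URL (Pkg v1.1+ Registry API), then install
    # a model package that the registry provides.
    Pkg.Registry.add(Pkg.RegistrySpec(url = "https://github.com/mimiframework/MimiRegistry.git"))
    Pkg.add("MimiFUND")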

The review thread below refers to this snippet in the docs:

    using MimiFUND
    m = getfund(nsteps = 100)
    m = MimiFUND.get_model(nsteps = 100)
@lrennels (Collaborator, Author):

@ckingdon95 I'm seeing an error after this call now:

ERROR: Mismatched data size for a parameter connection: dimension :time in <ComponentId MimiFUND.scenariouncertainty> has 101 elements; external parameter :scenpgrowth has 1051 elements.

any ideas?

Collaborator:

This PR addresses this and needs discussion:
fund-model/MimiFUND.jl#44

@lrennels (Collaborator, Author):

Ah ok thanks!

Collaborator:

@lrennels should we change this nsteps example, since the fix for it isn't merged into FUND yet? Or should we wait? Ideally these blocks would also be doctested, right?

@lrennels (Collaborator, Author):

I think you're right: we should change or remove it, since it isn't merged yet and we'd want these blocks doctested.

@lrennels lrennels removed the request for review from davidanthoff October 7, 2019 20:44
@lrennels lrennels dismissed davidanthoff’s stale review October 7, 2019 20:45

Handled the change, Cora and I will handle this branch from here

@corakingdon (Collaborator) commented:

Hmm, now it's stalling with the error: "No output has been received in the last 10m0s, this potentially indicates a stalled build or something wrong with the build itself."

That's the same thing happening on your other PR to run Travis on Windows, @davidanthoff, except that here it's also happening on Mac and Linux.

@davidanthoff (Collaborator) commented:

What an endless nightmare, this branch :)

I think the strategy now is to insert @info statements between the different includes in runtests.jl to figure out where it stalls; a sketch of that follows.
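
A minimal sketch of that approach (the file list here is illustrative; the real runtests.jl includes many more test files):

    # runtests.jl (sketch): log progress between includes to locate the stall
    using Test
    using Mimi
    using Documenter: doctest

    @info "Running doctests"
    doctest(Mimi)

    for f in ["test_timesteparrays.jl", "test_explorer_model.jl"]
        @info "Running $f"
        include(f)
    end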

@corakingdon (Collaborator) commented:

@davidanthoff we already have those @info statements, so we can see that the last one to give output before the stall was "test_timesteparrays.jl", which is about 3/4 of the way through all the tests in runtests.jl. Not sure what to do with that info, though.

@corakingdon (Collaborator) commented:

It did successfully run the doctests, I think, since I put that call at the top of the runtests file, so it's good that doctests now work in the test environment. So I guess this is now a different Travis problem?

@lrennels (Collaborator, Author) commented:

This branch is my nightmare. OK, so do we know which commit caused Mac Travis to start failing? Was it when we changed the .yml file calls to the registry adds, etc.?

@lrennels (Collaborator, Author) commented:

Ah, OK, I see. This might be off base, but I ran into that exact problem when the doctests were run once before the documentation build and then again in makedocs, and I have no idea why. So maybe something in the doctesting is making the environment stall. More @info statements may help us.

@lrennels (Collaborator, Author) commented Nov 18, 2019

@ckingdon95 @davidanthoff I think there's something going on here with the explorer and broken pipes, which I've seen before. When we run doctest(Mimi), that includes running several calls to explore. Then, in the test suite, test_timesteparrays.jl is the first test to call explore.

Trying this locally: if I run doctest(Mimi), let all of the tests run, close all of the explore windows and the external Julia instance, and then run the original testing suite, I get a familiar error at test_timesteparrays.jl:

Got exception outside of a @test
  LoadError: IOError: write: broken pipe (EPIPE)
  Stacktrace:
   [1] wait(::Task) at ./task.jl:198
   [2] fetch at ./task.jl:211 [inlined]
   [3] req_response(::Electron._Application{Electron.Window}, ::Dict{String,Any}) at /Users/lisarennels/.julia/packages/Electron/wL84D/src/Electron.jl:226
   [4] Electron.Window(::Electron._Application{Electron.Window}, ::Dict{String,Any}) at /Users/lisarennels/.julia/packages/Electron/wL84D/src/Electron.jl:291
   [5] #Window#15 at /Users/lisarennels/.julia/packages/Electron/wL84D/src/Electron.jl:309 [inlined]
   [6] (::getfield(Core, Symbol("#kw#Type")))(::NamedTuple{(:options,),Tuple{Dict{String,Any}}}, ::Type{Electron.Window}, ::Electron._Application{Electron.Window}, ::URIParser.URI) at ./none:0
   [7] #explore#224(::String, ::typeof(explore), ::Model) at /Users/lisarennels/.julia/dev/Mimi/src/explorer/explore.jl:34

While I don't see this error in the testing log, it seems likely to be related. One solution might be to remove the explore tests from the doctests; another might be to move the doctest call below the testing suite, though I'm not sure that would help.

@davidanthoff (Collaborator) commented:

Can we try the following:

  1. Remove the call to explore from test_timesteparrays.jl, that really shouldn't be there, right?
  2. Mark all code examples in the documentation that call explore as not a doctest, so that they don't run (see the sketch after this comment). We can still try to enable that later, but maybe it helps for now.

I tried to look through Electron.jl to figure out how the error that @lrennels reported might come about, but I don't understand it. @lrennels, could you open an issue over at Electron.jl with replication steps for that pipe error, just so we don't lose track of it?
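
For reference, a sketch of what (2) looks like in the markdown source. A block that Documenter executes and checks:

    ```jldoctest
    julia> explore(m)
    ```

becomes a highlighted-but-not-executed listing when the fence language is changed:

    ```julia
    explore(m)
    ```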

@lrennels (Collaborator, Author) commented:

Yep, I can certainly do both of those things! Per #1, I can move the specific test we're doing to test_explorer_model.jl. The goal is to make sure we can call explore on a model for which time is not the first dimension, so we will want to keep that test somewhere. (A sketch of such a model follows.)
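
A minimal sketch of that kind of test model; the component and dimension names here are made up for illustration, assuming current Mimi syntax:

    using Mimi

    # A component whose variable indexes time as the SECOND dimension,
    # which is the case the explorer test needs to cover.
    @defcomp testcomp begin
        regions = Index()
        v = Variable(index = [regions, time])

        function run_timestep(p, v, d, t)
            for r in d.regions
                v.v[r, t] = gettime(t)   # fill with the current model year
            end
        end
    end

    m = Model()
    set_dimension!(m, :time, 2000:2005)
    set_dimension!(m, :regions, ["A", "B"])
    add_comp!(m, testcomp)
    run(m)
    explore(m)   # should open the explorer without a dimension error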

@davidanthoff (Collaborator) commented Nov 23, 2019

So I think many of the structural problems are now resolved, but now we're running into output comparison tests that depend on a digit very far to the right, and slight platform-to-platform differences there make the tests fail...

I guess we could (a) for now just not compare outputs for these specific tests, so that we can merge this and fix the remaining problems in smaller follow-up PRs, or (b) try to output the results with fewer digits so that we don't run into these problems. Option (b) is of course tricky, because it would complicate our example code with something that is only needed for the tests...
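
One more possibility worth noting: Documenter's makedocs accepts a doctestfilters keyword, a list of regexes whose matches are stripped from both the expected and the actual output before comparison, which can mask the volatile trailing digits without touching the example code. A sketch, assuming the build script is docs/make.jl:

    # docs/make.jl (sketch): strip digits beyond the 4th decimal place from
    # both expected and actual doctest output before comparison, so
    # platform-dependent trailing digits don't fail the build.
    using Documenter, Mimi

    makedocs(
        sitename = "Mimi.jl",
        modules = [Mimi],
        doctestfilters = [r"(?<=\.\d{4})\d+"],
    )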

@lrennels and @ckingdon95 in any case, can you two take it from here? Let me know if you run into more of the infrastructure problems again, happy to help with those!

@lrennels lrennels requested a review from corakingdon November 27, 2019 06:29
@davidanthoff davidanthoff merged commit 3a5427d into master Nov 27, 2019
@davidanthoff davidanthoff deleted the doctests branch November 27, 2019 22:56