
Quarto fails to render figure layout with subcaptions #6993

Closed
andrewheiss opened this issue Sep 25, 2023 · 13 comments · Fixed by #6997
Assignees
Labels
bug Something isn't working crossref
Milestone

Comments

@andrewheiss

Bug description

(This might be related to #6912.)

When using subcaptions with the layout option for custom figure layouts, Quarto gives an error.

Error running filter /Applications/quarto/share/filters/main.lua:
/Applications/quarto/share/filters/main.lua:16847: attempt to index a nil value (field 'content')

Steps to reproduce

This works (no image-specific captions):

---
title: Layout test
---

```{r}
#| label: fig-charts
#| fig-cap: "Charts"
#| layout: "[1, 1]"

plot(cars)
plot(pressure)
```

This works (image-specific captions):

---
title: Layout test
---

```{r}
#| label: fig-charts
#| fig-cap: 
#|   - "Speed and Stopping Distances of Cars"
#|   - "Vapor Pressure of Mercury as a Function of Temperature"
#| layout: "[1, 1]"

plot(cars)
plot(pressure)
```

This doesn't work (image-specific subcaptions):

---
title: Layout test
---

```{r}
#| label: fig-charts
#| fig-cap: "Charts"
#| fig-subcap: 
#|   - "Cars"
#|   - "Pressure"
#| layout: "[1, 1]"

plot(cars)
plot(pressure)
```
Error running filter /Applications/quarto/share/filters/main.lua:
/Applications/quarto/share/filters/main.lua:16847: attempt to index a nil value (field 'content')
stack traceback:
	/Applications/quarto/share/filters/main.lua:17242: in function 'float_reftarget_render_html_figure'
	/Applications/quarto/share/filters/main.lua:13546: in field 'render'
	/Applications/quarto/share/filters/main.lua:670: in local 'filter_fn'
	/Applications/quarto/share/filters/main.lua:226: in function </Applications/quarto/share/filters/main.lua:216>
	(...tail calls...)
	[C]: in ?
	[C]: in method 'walk'
	/Applications/quarto/share/filters/main.lua:148: in function </Applications/quarto/share/filters/main.lua:138>
	(...tail calls...)
	/Applications/quarto/share/filters/main.lua:725: in local 'callback'
	/Applications/quarto/share/filters/main.lua:738: in upvalue 'run_emulated_filter_chain'
	/Applications/quarto/share/filters/main.lua:773: in function </Applications/quarto/share/filters/main.lua:770>
stack traceback:
	/Applications/quarto/share/filters/main.lua:148: in function </Applications/quarto/share/filters/main.lua:138>
	(...tail calls...)
	/Applications/quarto/share/filters/main.lua:725: in local 'callback'
	/Applications/quarto/share/filters/main.lua:738: in upvalue 'run_emulated_filter_chain'
	/Applications/quarto/share/filters/main.lua:773: in function </Applications/quarto/share/filters/main.lua:770>

Expected behavior

There should be subcaptions with images.

Actual behavior

Nothing renders :)

Your environment

  • IDE: RStudio 2023.06.2+561
  • OS: macOS Ventura 13.5.2

Quarto check output

Quarto 1.4.382
[✓] Checking versions of quarto binary dependencies...
      Pandoc version 3.1.8: OK
      Dart Sass version 1.55.0: OK
      Deno version 1.33.4: OK
[✓] Checking versions of quarto dependencies......OK
[✓] Checking Quarto installation......OK
      Version: 1.4.382
      Path: /Applications/quarto/bin

[✓] Checking tools....................OK
      TinyTeX: v2022.08
      Chromium: (not installed)

[✓] Checking LaTeX....................OK
      Using: TinyTex
      Path: /Users/andrew/Library/TinyTeX/bin/universal-darwin
      Version: 2022

[✓] Checking basic markdown render....OK

[✓] Checking Python 3 installation....OK
      Version: 3.11.3
      Path: /opt/homebrew/opt/[email protected]/bin/python3.11
      Jupyter: 5.3.0
      Kernels: python3

[✓] Checking Jupyter engine render....OK

[✓] Checking R installation...........OK
      Version: 4.3.1
      Path: /Library/Frameworks/R.framework/Resources
      LibPaths:
        - /Library/Frameworks/R.framework/Versions/4.3-arm64/Resources/library
      knitr: 1.44
      rmarkdown: 2.25

[✓] Checking Knitr engine render......OK
@andrewheiss andrewheiss added the bug Something isn't working label Sep 25, 2023
@cscheid cscheid self-assigned this Sep 25, 2023
@cscheid cscheid added this to the v1.4 milestone Sep 25, 2023
@cscheid
Collaborator

cscheid commented Sep 25, 2023

Thank you! We'll fix ASAP.

@cscheid
Collaborator

cscheid commented Sep 25, 2023

Another immediate workaround, if you're looking for one, is to not use layout: [1, 1]. (It's clearly a bug on our side, but this will stop Quarto from crashing for the next few hours...)

@andrewheiss
Author

Oh, that's just for the sake of the reprex. In real life it's a more complex multipanel layout like [[70,30], [100]], but I've just turned off the captions for now, so it's all good :)
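For reference, a multi-row layout like that is written as a nested array, one inner list per row (a sketch using toy plots; the third plot is made up for illustration):

```{r}
#| label: fig-multipanel
#| fig-cap: "Charts"
#| layout: "[[70, 30], [100]]"

# First row: two plots at 70%/30% width
plot(cars)
plot(pressure)
# Second row: one full-width plot
plot(mtcars$wt, mtcars$mpg)
```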

@andrewheiss
Author

Oh, and I don't know if it's related to this issue or if I should open a new one, but layout-valign seems to emit a quarto-layout-valign-top class regardless of whether the option is top, center, or bottom.
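For context, a minimal cell that exercises the option (a sketch; per the above, the rendered HTML reportedly carries quarto-layout-valign-top even when bottom is set):

```{r}
#| label: fig-valign
#| layout-ncol: 2
#| layout-valign: bottom

plot(cars)
plot(pressure)
```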

@cscheid
Collaborator

cscheid commented Sep 25, 2023

> Oh, and idk if it's related to this issue or if I should open a new one, but layout-valign seems to be emitting a quarto-layout-valign-top class regardless of whether the option is top, center or bottom

Probably a new one. (I ripped out all of the old crossref to support #4944 and we're going through a bit of regression churn. I apologize!!)

@radovan-miletic

Hi @cscheid,

I'm not sure if this warrants opening a new issue...
I'm getting the same error with the option eval: false:

---
title: Layout test
---

```{r}
#| label: fig-charts
#| fig-cap: "Charts"
#| fig-subcap: 
#|   - "Cars"
#|   - "Pressure"
#| layout: "[1, 1]"
#| eval: false

plot(cars)
plot(pressure)
```

Thank you!

@cscheid
Collaborator

cscheid commented Oct 1, 2023

Ok, we do need to fix that error, but I'm not sure what output you're expecting to get if the code cell itself has eval: false.

@radovan-miletic

Thanks @cscheid !
I'm using the option eval: false when I need to switch between chunks without evaluating one of them.
My understanding is that eval: false tells Quarto not to run the code or generate output for that particular code chunk.

@mcanouil
Collaborator

mcanouil commented Oct 1, 2023

@radovan-miletic Have you considered using project profiles and conditional contents?
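For illustration, a sketch of that approach, assuming a hypothetical profile named draft that you select with `quarto render --profile draft`:

````
::: {.content-visible when-profile="draft"}

```{r}
#| label: fig-charts
#| fig-cap: "Charts"

plot(cars)
```

:::
````

The cell inside the div is only included when the draft profile is active, which avoids needing eval: false for this use case.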

@radovan-miletic

Thanks @mcanouil , it's certainly worth exploring!

@cscheid
Collaborator

cscheid commented Oct 2, 2023

> I'm using the option eval: false when I need to switch between chunks without evaluating one of them.

Ok. But what I meant is that your code cell has labels, captions, and layout options, but with eval: false, we have no content with which to build a figure.

@radovan-miletic

radovan-miletic commented Oct 2, 2023

I should have been more explicit.
Up to v1.4.366, eval: false instructed Quarto not to run the code or generate output for a particular chunk, even when labels, captions, and layout options were present.
After that version, eval: false still tells Quarto not to run the code, with the exception of fig-cap and tbl-cap, as far as I can see.
I only noticed that behavior after your last comment.
It could be another issue, if it's not intentional.

@cderv
Collaborator

cderv commented Oct 3, 2023

@cscheid @radovan-miletic I opened a new issue at #7118 to continue the discussion in its own thread. It seems there are some changes to discuss.
